Fields:
  patch:     stringlengths 18 to 160k
  callgraph: stringlengths 4 to 179k
  summary:   stringlengths 4 to 947
  msg:       stringlengths 6 to 3.42k
@@ -356,7 +356,7 @@ class DownpourOptimizer(DistributedOptimizer):
                 parameter_list, no_grad_set, self._strategy)
-
+        opt_info["mpi_rank"] = fleet._role_maker._get_rank()
         fleet._set_opt_info(opt_info)
         programs = [loss.block.program for loss in losses]
[PSLib->[shrink_dense_table->[shrink_dense_table],shrink_sparse_table->[shrink_sparse_table],init_worker->[init_worker],run_server->[init_server,run_server]],DownpourOptimizer->[minimize->[_set_opt_info]],PSLib]
Minimizes a program through loss and parameters of a .
maybe trainer_rank or trainer_id is better. role_maker can have many types (mpi or paddlecloud, etc.) and all of them can get_rank
@@ -317,6 +317,11 @@ function GetContacts($results)
         }
     }

+    # Copy all email alerts to default contact if configured.
+    if ($config['alert']['default_copy'] === true) {
+        $tmp_contacts[$config['alert']['default_mail']] = '';
+    }
+
     # Send email to default contact if no other contact found
     if ((count($tmp_contacts) == 0) && ($config['alert']['default_if_none']) && (!empty($config['alert']['default_mail']))) {
         $tmp_contacts[$config['alert']['default_mail']] = '';
[ExtTransports->[deliverAlert],GetContacts->[validateAddress,getUserlevel,getUserlist]]
Get contacts from DB get all users who have access level > = 0 and user has access level > = 5 get all contacts if none and default mail.
Before assigning it to $tmp_contacts, you need to check it doesn't exist already :)
@@ -298,7 +298,7 @@ constant-if usage-requirements(mypkg2,32,x86,17,gnu,linux,gcc-6.3,release) :
 # mypkg
 if $(__define_targets__) {
     call-in-project $(mypkg-mod)
         : lib MyLib1
-        :
+        : ''' + '''
         : <name>MyLib1 <search>$(libdirs(mypkg,32,x86,17,gnu,linux,gcc-6.3,release)) $(requirements(mypkg,32,x86,17,gnu,linux,gcc-6.3,release))
         :
         : $(usage-requirements(mypkg,32,x86,17,gnu,linux,gcc-6.3,release)) ;
[B2GeneratorTest->[b2_empty_settings_test->[EnvValues,B2Generator,TestBufferConanOutput,ConanFile,Settings,initialize],b2_test->[append,EnvValues,B2Generator,assertEqual,CppInfo,update,extend,TestBufferConanOutput,ConanFile,items,Settings,loads,initialize]]]
Test for missing missing - unused - unused - unused - unused - unused - unused - unused B2 definitions for Conan packages. A list of all missing missing - missing - missing - missing - missing - missing - missing B2 definitions for Conan packages.
This looks superfluous.
@@ -30,12 +30,12 @@ def train_one_epoch(model, criterion, optimizer, data_loader, device, epoch, arg
     for i, (image, target) in enumerate(metric_logger.log_every(data_loader, args.print_freq, header)):
         start_time = time.time()
         image, target = image.to(device), target.to(device)
-        with torch.cuda.amp.autocast(enabled=args.amp):
+        with torch.cuda.amp.autocast(enabled=scaler is not None):
             output = model(image)
             loss = criterion(output, target)

         optimizer.zero_grad()
-        if args.amp:
+        if scaler is not None:
             scaler.scale(loss).backward()
             if args.clip_grad_norm is not None:
                 # we should unscale the gradients of optimizer's assigned params if do gradient clipping
[load_data->[_get_cache_path],main->[load_data,evaluate,train_one_epoch],get_args_parser,main]
Train one epoch of the model. Update the metrics for the next batch.
Why this change? scaler is not None in case args.amp is true, but having args.amp here makes the code more readable.
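The equivalence the reviewer is asking about can be illustrated without torch. In a minimal sketch (stand-in names, not the torchvision code), the scaler is created exactly once from `args.amp`, so `args.amp` and `scaler is not None` always agree; gating on the scaler keeps a single source of truth for whether mixed precision is active.

```python
# Minimal sketch, assuming the usual pattern where the scaler is constructed
# from args.amp once at startup. Args and make_scaler are hypothetical
# stand-ins for the real argparse namespace and torch.cuda.amp.GradScaler.
class Args:
    def __init__(self, amp):
        self.amp = amp

def make_scaler(args):
    # Stand-in for: torch.cuda.amp.GradScaler() if args.amp else None
    return object() if args.amp else None

for amp in (True, False):
    args = Args(amp)
    scaler = make_scaler(args)
    # The rewritten condition is equivalent to checking args.amp directly.
    assert (scaler is not None) == args.amp
```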
@@ -523,7 +523,7 @@ function makeCorrelator(pageViewId, opt_clientId) {
 /**
  * Collect additional dimensions for the brdim parameter.
  * @param {!Window} win The window for which we read the browser dimensions.
- * @param {{width: number, height: number}|null} viewportSize
+ * @param {?{width: number, height: number}} viewportSize
  * @return {string}
  * @visibleForTesting
  */
[No CFG could be retrieved]
The page - view - specific correlator function. Returns the amp - analytics config for a new CSI trigger.
Could you leave me a `TODO(rcebulko): Define Rect extern type` here?
@@ -65,10 +65,12 @@ namespace Dynamo.Utilities
         /// the bin/nodes directory. Add the types to the searchviewmodel and
         /// the controller's dictionaries.
         /// </summary>
+        /// <param name="nodeDirectories">Directories that contain node assemblies.</param>
         /// <param name="context"></param>
         /// <param name="modelTypes"></param>
         /// <param name="migrationTypes"></param>
-        public void LoadNodeModelsAndMigrations(string context, out List<TypeLoadData> modelTypes, out List<TypeLoadData> migrationTypes)
+        public void LoadNodeModelsAndMigrations(IEnumerable<string> nodeDirectories,
+            string context, out List<TypeLoadData> modelTypes, out List<TypeLoadData> migrationTypes)
         {
             var loadedAssembliesByPath = new Dictionary<string, Assembly>();
             var loadedAssembliesByName = new Dictionary<string, Assembly>();
[DynamoLoader->[LoadNodeModelsAndMigrations->[OnAssemblyLoaded],LoadNodesFromAssembly->[IsNodeSubType,IsMigration]]]
Load the node assembly and its migrations. Load a node from assembly path.
Additional `nodes` directories are now being passed to `DynamoLoader` instead of it having to depend on some `static` list.
@@ -41,13 +41,14 @@ type baseClient struct {
 	leader atomic.Value // Store as string
 	// PD follower URLs
 	followers atomic.Value // Store as []string
-	// dc-location -> TSO allocator leader gRPC connection
+	// addr -> TSO allocator leader gRPC connection
 	clientConns sync.Map // Store as map[string]*grpc.ClientConn
 	// dc-location -> TSO allocator leader URL
 	allocators sync.Map // Store as map[string]string

-	checkLeaderCh        chan struct{}
-	checkTSODispatcherCh chan struct{}
+	checkLeaderCh          chan struct{}
+	checkTSODispatcherCh   chan struct{}
+	updateConnectionCtxsCh chan struct{}

 	wg  sync.WaitGroup
 	ctx context.Context
[switchLeader->[GetLeaderAddr],updateMember->[scheduleCheckTSODispatcher],switchTSOAllocatorLeader->[getAllocatorLeaderAddrByDCLocation,gcAllocatorLeaderAddr]]
Creates a baseClient for a list of cluster identifiers. newBaseClient creates a new baseClient for the given urls and security options.
Not necessarily the TSO allocator leader?
@@ -19,10 +19,12 @@ exports.config = {
   directConnect: true,

-  baseUrl: 'http://localhost:<%= serverPort %>/',
+  baseUrl: 'http://localhost:8080/',

   framework: 'mocha',

+  SELENIUM_PROMISE_MANAGER: false,
+
   mochaOpts: {
     reporter: 'spec',
     slow: 3000,
[No CFG could be retrieved]
Config for the module.
Should use the variable
@@ -111,6 +111,7 @@ func Key(namespaceOrName string, nameOpt ...string) client.ObjectKey {
 }

 // KeyFromObject obtains the client.ObjectKey from the given metav1.Object.
+// Deprecated: use client.ObjectKeyFromObject instead.
 func KeyFromObject(obj metav1.Object) client.ObjectKey {
 	return Key(obj.GetNamespace(), obj.GetName())
 }
[Before,GetOwnerReferences,MatchingLabels,GetAnnotations,ObjectKeyFromObject,NewAccessor,GetDeletionTimestamp,SetLabels,SetAnnotations,MinorError,IsZero,SevereError,ExtractList,Strings,LenList,Error,FormatBool,New,Ok,GetLabels,GetName,Errorf,GetPodLogs,HumanDuration,Accessor,GetNamespace,Since,InNamespace,Until,Infof,Name,Sort,Get,IsListType,SetList,EachListItem,Fprintf,DirectClient,WithTimeout,Sprintf,Client,List,String,IsNotFound,UntilTimeout]
Key creates a new client. ObjectKey from the given parameters. WaitUntilResourceDeleted waits until the resource is deleted.
As this func is now deprecated, shouldn't we replace all of its usage under `extensions/` to `client.ObjectKeyFromObject`?
@@ -39,3 +39,11 @@ func OAuthClientAuthorizationToSelectableFields(obj *OAuthClientAuthorization) f
 		"userUID": obj.UserUID,
 	}
 }
+
+// SelfOAuthClientAuthorizationToSelectableFields returns a label set that represents the object
+func SelfOAuthClientAuthorizationToSelectableFields(obj *SelfOAuthClientAuthorization) fields.Set {
+	return fields.Set{
+		"metadata.name": obj.Name,
+		"clientName":    obj.ClientName,
+	}
+}
[No CFG could be retrieved]
obj. UserUID.
why is userUID exposed for this type?
@@ -0,0 +1,5 @@
+FactoryGirl.define do
+  factory :samples_table do
+
+  end
+end
[No CFG could be retrieved]
No Summary Found.
Trailing whitespace detected.
@@ -24,8 +24,9 @@ package io.druid.segment;
  */
 public interface ColumnSelectorFactory
 {
-  public TimestampColumnSelector makeTimestampColumnSelector();
+  public LongColumnSelector makeTimestampColumnSelector();
   public DimensionSelector makeDimensionSelector(String dimensionName);
   public FloatColumnSelector makeFloatColumnSelector(String columnName);
+  public LongColumnSelector makeLongColumnSelector(String columnName);
   public ObjectColumnSelector makeObjectColumnSelector(String columnName);
 }
[No CFG could be retrieved]
Creates a column selector for a timestamp dimension float or object column.
I would just remove this method and use the actual name for the time column (`__time` I believe) This will make it possible to use the time column for other purposes as well.
@@ -1238,15 +1238,11 @@ public class SnapshotManagerImpl extends MutualExclusiveIdsManagerBase implement
         try {
             postCreateSnapshot(volume.getId(), snapshotId, payload.getSnapshotPolicyId());

-            DataStoreRole dataStoreRole = getDataStoreRole(snapshot, _snapshotStoreDao, dataStoreMgr);
+            DataStoreRole dataStoreRole = getDataStoreRole(snapshot);
             SnapshotDataStoreVO snapshotStoreRef = _snapshotStoreDao.findBySnapshot(snapshotId, dataStoreRole);
             if (snapshotStoreRef == null) {
-                // The snapshot was not backed up to secondary. Find the snap on primary
-                snapshotStoreRef = _snapshotStoreDao.findBySnapshot(snapshotId, DataStoreRole.Primary);
-                if (snapshotStoreRef == null) {
-                    throw new CloudRuntimeException("Could not find snapshot");
-                }
+                throw new CloudRuntimeException("Could not find snapshot");
             }

             UsageEventUtils.publishUsageEvent(EventTypes.EVENT_SNAPSHOT_CREATE, snapshot.getAccountId(), snapshot.getDataCenterId(), snapshotId, snapshot.getName(), null, null,
                     snapshotStoreRef.getPhysicalSize(), volume.getSize(), snapshot.getClass().getName(), snapshot.getUuid());
[SnapshotManagerImpl->[takeSnapshot->[takeSnapshot,postCreateSnapshot],findRecurringSnapshotSchedule->[listPoliciesforVolume],deleteSnapshotPolicies->[deletePolicy],deleteSnapshotDirsForAccount->[deleteSnapshot],start->[deleteSnapshot],allocSnapshot->[supportedByHypervisor,getSnapshotType],sendToPool->[sendToPool],revertSnapshot->[revertSnapshot],persistSnapshotPolicy->[updateSnapshotPolicy],copySnapshotPoliciesBetweenVolumes->[persistSnapshotPolicy],deletePoliciesForVolume->[deletePolicy,listPoliciesforVolume],BackupSnapshotTask->[runInContext->[BackupSnapshotTask]],deleteSnapshot->[deleteSnapshot],supportedByHypervisor->[hostSupportSnapsthotForVolume],postCreateSnapshot->[getSnapshotUserId],cleanupSnapshotsByVolume->[deleteSnapshot],getSnapshotType->[getSnapshotType]]]
Take a snapshot of a volume. count all resources in snapshot.
@slavkap can you include snapshot id / name and location (primary / image) in the error message.
@@ -128,8 +128,13 @@ class PythonToolRequirementsBase(Subsystem):
         If the tool supports lockfiles, the returned type will install from the lockfile rather than
         `all_requirements`.
         """
-        if not self.register_lockfile or self.lockfile == "<none>":
+
+        if not self.is_lockfile:
             return PexRequirements((*self.all_requirements, *extra_requirements))
+
+        requirements = FrozenOrderedSet([*self.all_requirements, *extra_requirements])
+        hex_digest = calculate_invalidation_digest(requirements)
+
         if self.lockfile == "<default>":
             assert self.default_lockfile_resource is not None
             return PexRequirements(
[PythonToolRequirementsBase->[interpreter_constraints->[InterpreterConstraints],register_options->[register,super,ValueError],pex_requirements->[PexRequirements,FileContent,read_binary],version->[cast],extra_requirements->[tuple],lockfile->[cast]],PythonToolBase->[main->[parse,cast,is_default,OptionsError,ConsoleScript],register_options->[register,super,isinstance]]]
The requirements to be used when installing the tool.
Some duplication here. You could update `calculate_invalidation_digest` to take `Iterable[str]` instead of `FrozenOrderedSet[str]`, which would allow you to define the `PexRequirements` once and then pass `requirements.req_strings`(?) to the function.
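The reviewer's suggestion of having `calculate_invalidation_digest` accept `Iterable[str]` could look roughly like the sketch below. This is a hypothetical reimplementation, not the actual Pants helper: the function name and hashing scheme are assumptions made for illustration, and the real implementation may hash differently.

```python
import hashlib
import json
from typing import Iterable

def calculate_invalidation_digest(requirements: Iterable[str]) -> str:
    """Hypothetical sketch: hash the sorted requirement strings so the digest
    is independent of input order and of the container type passed in."""
    m = hashlib.sha256()
    m.update(json.dumps(sorted(requirements)).encode())
    return m.hexdigest()

# Any iterable of the same requirement strings yields the same digest,
# which is what lets the caller pass requirements.req_strings directly.
assert calculate_invalidation_digest({"a==1", "b==2"}) == \
    calculate_invalidation_digest(("b==2", "a==1"))
```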
@@ -649,7 +649,16 @@ def compute_merkletree_without(merkletree, lockhash):


 def create_senddirecttransfer(channel_state, amount, identifier):
-    our_balance_proof = channel_state.our_state.balance_proof
+    our_state = channel_state.our_state
+    partner_state = channel_state.partner_state
+
+    msg = 'caller must make sure there is enough balance'
+    assert amount <= get_distributable(our_state, partner_state), msg
+
+    msg = 'caller must make sure the channel is open'
+    assert get_status(channel_state) == CHANNEL_STATE_OPENED, msg
+
+    our_balance_proof = our_state.balance_proof

     if our_balance_proof:
         transferred_amount = amount + our_balance_proof.transferred_amount
[register_secret_endstate->[is_locked],handle_block->[is_deposit_confirmed,get_status],send_refundtransfer->[create_sendmediatedtransfer],handle_send_directtransfer->[get_status,send_directtransfer,get_distributable],send_directtransfer->[create_senddirecttransfer],del_lock->[is_known],send_mediatedtransfer->[create_sendmediatedtransfer],handle_channel_withdraw->[del_lock,compute_proof_for_lock,get_lock,is_locked,register_secret],handle_receive_refundtransfer->[handle_receive_mediatedtransfer],get_distributable->[get_amount_locked,get_balance],handle_channel_closed->[get_known_unlocks,get_status,set_closed],handle_channel_newbalance->[is_transaction_confirmed],create_senddirecttransfer->[get_next_nonce],send_unlock->[create_unlock,get_lock,del_lock],create_unlock->[get_next_nonce,compute_merkletree_without,is_known],is_valid_mediatedtransfer->[is_valid_signature],handle_receive_mediatedtransfer->[is_valid_mediatedtransfer],is_valid_directtransfer->[is_valid_signature],handle_receive_secretreveal->[register_secret],handle_channel_settled->[set_settled],handle_receive_directtransfer->[get_current_balanceproof,is_valid_directtransfer],create_sendmediatedtransfer->[get_next_nonce,compute_merkletree_with,get_distributable],events_for_close->[get_status],state_transition->[handle_channel_withdraw,handle_action_close,handle_block,handle_channel_closed,handle_channel_newbalance,handle_receive_directtransfer,handle_send_directtransfer,handle_channel_settled],handle_action_close->[events_for_close],handle_unlock->[del_lock,is_valid_unlock],register_secret->[register_secret_endstate],is_valid_unlock->[is_valid_signature]]
Create a direct transfer.
These assertions here are for a programming error, right? Caller here refers to the function's caller I suppose?
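The pattern the reviewer is probing at is precondition assertions: the `assert` statements document invariants that the function's caller must guarantee, so a failure indicates a programming error upstream, not bad user input. A hypothetical reduction of the pattern (stand-in names, not the Raiden code):

```python
# Minimal sketch, assuming a simplified channel model. create_transfer and its
# parameters are hypothetical stand-ins for create_senddirecttransfer.
def create_transfer(distributable, channel_open, amount):
    msg = 'caller must make sure there is enough balance'
    assert amount <= distributable, msg

    msg = 'caller must make sure the channel is open'
    assert channel_open, msg

    return {'amount': amount}
```

Because these are assertions rather than raised exceptions, they can be stripped with `python -O`, which is only acceptable when every call site already validates the inputs.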
@@ -799,7 +799,7 @@ static int dai_config(struct comp_dev *dev, struct sof_ipc_dai_config *config)
 		}
 	}

-	return 0;
+	return dai_set_config(dd->dai, config);
 }

 static void dai_cache(struct comp_dev *dev, int cmd)
[No CFG could be retrieved]
find the next config in the list of DAIs find the DAO that is used by the DAO.
I assume there is something else ensuring dai_config is called after the IPC handler finishes everything it wants to do?
@@ -1051,8 +1051,17 @@ def test_resize_antialias(device, dt, size, interpolation, tester):
     tester.approxEqualTensorToPIL(
         resized_tensor_f, resized_pil_img, tol=0.5, msg=f"{size}, {interpolation}, {dt}"
     )
+
+    accepted_tol = 1.0 + 1e-5
+    if interpolation == BICUBIC:
+        # this overall mean value to make the tests pass
+        # High value is mostly required for test cases with
+        # downsampling and upsampling where we can not exactly
+        # match PIL implementation.
+        accepted_tol = 15.0
+
     tester.approxEqualTensorToPIL(
-        resized_tensor_f, resized_pil_img, tol=1.0 + 1e-5, agg_method="max",
+        resized_tensor_f, resized_pil_img, tol=accepted_tol, agg_method="max",
         msg=f"{size}, {interpolation}, {dt}"
     )
[test_perspective_batch->[_test_fn_on_batch],Tester->[test_autocontrast->[_test_adjust_fn],test_rotate->[_test_fn_on_batch,_test_rotate_all_options],test_adjust_hue->[_test_adjust_fn],test_center_crop->[_test_fn_on_batch],test_adjust_contrast->[_test_adjust_fn],test_equalize->[_test_adjust_fn],test_vflip->[_test_fn_on_batch],test_solarize->[_test_adjust_fn],test_affine->[_test_affine_rect_rotations,_test_affine_translations,_test_affine_square_rotations,_test_affine_all_ops,_test_fn_on_batch,_test_affine_identity_map],test_hsv2rgb->[_test_fn_on_batch],test_hflip->[_test_fn_on_batch],test_rgb2hsv->[_test_fn_on_batch],test_resize->[_test_fn_on_batch],test_adjust_sharpness->[_test_adjust_fn],test_pad->[_test_fn_on_batch],_test_adjust_fn->[_test_fn_on_batch],test_adjust_brightness->[_test_adjust_fn],test_posterize->[_test_adjust_fn],test_rgb_to_grayscale->[_test_fn_on_batch],test_crop->[_test_fn_on_batch],test_adjust_saturation->[_test_adjust_fn],test_invert->[_test_adjust_fn],test_adjust_gamma->[_test_adjust_fn],test_resized_crop->[_test_fn_on_batch]],_get_data_dims_and_points_for_perspective]
Test resize antialias.
Can you share some image examples with me of the image difference between PIL and your implementation? Something like `plt.imshow(pil_interp - tv_interp)` so that I can see what types of differences we are seeing here?
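The comparison the reviewer asks for can be sketched with small stand-in "images" instead of real PIL/torchvision outputs: compute the per-pixel difference and its maximum absolute value, which is exactly the quantity that `agg_method="max"` bounds with `accepted_tol`. The arrays here are invented toy data, not results from either implementation.

```python
# Hypothetical 2x2 stand-ins for the two resized images being compared.
pil_interp = [[0.0, 0.5], [1.0, 0.25]]   # stand-in for the PIL result
tv_interp = [[0.1, 0.4], [0.9, 0.30]]    # stand-in for the tensor result

diff = [[p - t for p, t in zip(prow, trow)]
        for prow, trow in zip(pil_interp, tv_interp)]
max_abs_diff = max(abs(v) for row in diff for v in row)
print(max_abs_diff)

# With real images, one could then visualize the difference map, e.g.:
#   plt.imshow(diff); plt.colorbar()
```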
@@ -46,7 +46,7 @@ class Ncdu(Package):
     def install(self, spec, prefix):
         configure('--prefix=%s' % prefix,
-                  '--with-ncurses=%s' % spec['ncurses'])
+                  '--with-ncursesw=%s' % spec['ncurses'])

         make()
         make("install")
[Ncdu->[install->[make,configure],depends_on,version]]
Installs a new node in the system.
Does this work when `ncurses~wide` is used?
@@ -25,6 +25,7 @@ __all__ = [

 import numpy as np

+from ...framework import get_default_dtype
 from ...fluid.dygraph import layers
 from ...fluid.initializer import Normal
 from .. import functional as F
[Conv3D->[__init__->[_get_default_param_initializer]],Conv2D->[__init__->[_get_default_param_initializer]]]
Creates a callable object of the type which is used to construct a callable object of Input and Output are in NCHW format.
should be handled internally
@@ -12,6 +12,8 @@ namespace System.IO
 {
     public class EnumerationOptions
     {
+        private int _maxRecursionDepth = DefaultMaxRecursionDepth;
+
         /// <summary>
         /// For internal use. These are the options we want to use if calling the existing Directory/File APIs where you don't
         /// explicitly specify EnumerationOptions.
[No CFG could be retrieved]
Creates an EnumerationOptions object that can be used to enumerate files and directories. Missing. Requires a permission.
For consistency can you please set this field in the ctor.
@@ -71,4 +71,10 @@ describes.realWin('adsenseDelayedFetch', {}, env => {
       hid: pageViewId,
     });
   });
+
+  it('throws on invalid responsive ad unit height', () => {
+    const data = {'autoFormat': 'rspv', 'height': '666'};
+    expect(() => adsense(env.win, data)).to.throw(
+      /Specified height 666 in <amp-ad> tag is not equal to the required/);
+  });
 });
[No CFG could be retrieved]
The pageViewId is the id of the page view that is being displayed.
No need for brackets, this can be: expect(() => adsense(env.win, data)).to.throw(...)
@@ -65,6 +65,7 @@ def create_chromium_webdriver() -> WebDriver:
     options.add_argument("--hide-scrollbars")
     options.add_argument("--force-device-scale-factor=1")
     options.add_argument("--force-color-profile=srgb")
+    options['loggingPrefs'] = {'browser': 'ALL'}
     return webdriver.Chrome(options=options)

 #-----------------------------------------------------------------------------
[_WebdriverState->[_create->[create_firefox_webdriver,create_chromium_webdriver,_try_create_chromium_webdriver,_try_create_firefox_webdriver],reset->[terminate],get->[reset],cleanup->[reset,terminate]],_try_create_chromium_webdriver->[create_chromium_webdriver],_try_create_firefox_webdriver->[create_firefox_webdriver],cleanup,_WebdriverState]
Creates a Chrome webdriver with the given options.
This won't have an effect, because in tests there is a pytest plugin that manages webdrivers (`bokeh._testing.plugins.selenium`).
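Beyond the reviewer's point about the pytest plugin, the patched line is also questionable on its own: Selenium's `ChromeOptions` is not a mapping, so item assignment is not supported on it. The Selenium 4 route for requesting browser logs is `set_capability` with the `goog:loggingPrefs` key. The sketch below uses a stand-in class (not the real selenium API) to show the shape of that call:

```python
# Hypothetical stand-in mimicking the relevant slice of selenium's
# ChromeOptions; the real class stores capabilities the same way but
# does far more.
class ChromeOptionsStandIn:
    def __init__(self):
        self.capabilities = {}

    def set_capability(self, name, value):
        self.capabilities[name] = value

options = ChromeOptionsStandIn()
# Equivalent in spirit to the intent of options['loggingPrefs'] = ...,
# but using the supported capability-setting call.
options.set_capability("goog:loggingPrefs", {"browser": "ALL"})
```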
@@ -193,8 +193,16 @@ class GradersController < ApplicationController
         if grader_ids.blank?
           render text: I18n.t('assignment.group.select_a_grader'), status: 400
         else
+          if params[:skip_empty_submissions] == 'true'
+            @found_empty_submission = false;
+            grouping_ids = find_empty_submissions(grouping_ids)
+          end
           randomly_assign_graders(grouping_ids, grader_ids)
-          head :ok
+          if @found_empty_submission == true
+            render text: I18n.t('assignment.group.group_submission_no_files'), status: 200
+          else
+            head :ok
+          end
         end
       end
     when 'criteria_table'
[GradersController->[criteria_with_assoc->[where,includes],csv_upload_grader_groups_mapping->[redirect_to,nil?,assign_tas_by_csv,size,read,post?,join,t],randomly_assign_graders->[randomly_assign_tas],groups_coverage_dialog->[render,find],set_assign_criteria->[head,save,assign_graders_to_criteria,find],populate->[all,get_groups_table_info_no_criteria,find,render,get_groups_table_info_with_criteria,get_graders_table_info_with_criteria,get_graders_table_info_no_criteria,assign_graders_to_criteria],assign_all_graders->[assign_all_tas],unassign_graders->[id,find,unassign_tas,map],randomly_assign_graders_to_criteria->[randomly_assign_tas],global_actions->[render,find,uniq,unassign_graders_from_criteria,assign_all_graders_to_criteria,blank?,head,t,assign_all_graders,unassign_graders,randomly_assign_graders,randomly_assign_graders_to_criteria],download_grader_criteria_mapping->[criteria_with_assoc,send_data,find,user_name,generate,each,get_name,push],groupings_with_assoc->[where,includes],index->[t,find,size],grader_criteria_dialog->[render,find],add_grader_to_grouping->[add_tas,find,head,criterion,each,save,map],csv_upload_grader_criteria_mapping->[redirect_to,nil?,assign_tas_by_csv,find,size,read,post?,join,t],unassign_graders_from_criteria->[unassign_tas],download_grader_groupings_mapping->[send_data,find,user_name,group_name,generate,groupings_with_assoc,each,push],assign_all_graders_to_criteria->[assign_all_tas],include,before_filter]]
Displays all global actions and any associated criterion that are associated with the user. Randomly assign a grader to a criterion and return a 200 if the criterion is not.
Do not use semicolons to terminate expressions.
@@ -316,8 +316,6 @@ namespace Dynamo.Views
             Canvas.SetRight(view, 0);
         }

-        private double currentNodeCascadeOffset = 0.0;
-
         void vm_RequestNodeCentered(object sender, EventArgs e)
         {
             ModelEventArgs args = e as ModelEventArgs;
[WorkspaceView->[vm_ZoomAtViewportPoint->[AdjustZoomForCurrentZoomAmount,Point],vm_ZoomToFitView->[vm_ZoomChanged,vm_CurrentOffsetChanged],ZoomAtViewportPoint->[vm_ZoomChanged,vm_CurrentOffsetChanged],vm_CurrentOffsetChanged->[Point]]]
This method is called when a node is added to the outer canvas. It is called when.
moved to the top where all class fields are declared
@@ -319,13 +319,11 @@ func patchAdmissionResponse(resp *v1beta1.AdmissionResponse, patchBytes []byte)
 	resp.PatchType = &pt
 }

-func patchMutatingWebhookConfiguration(cert certificate.Certificater, meshName, osmNamespace, webhookConfigName string, clientSet kubernetes.Interface) error {
-	if err := hookExists(clientSet, webhookConfigName); err != nil {
-		log.Error().Err(err).Msgf("Error getting MutatingWebhookConfiguration %s", webhookConfigName)
-	}
-	updatedWH := admissionv1beta1.MutatingWebhookConfiguration{
+// getPartialMutatingWebhookConfiguration returns only the portion of the MutatingWebhookConfiguration that needs to be updated.
+func getPartialMutatingWebhookConfiguration(cert certificate.Certificater, webhookName string) admissionv1beta1.MutatingWebhookConfiguration {
+	return admissionv1beta1.MutatingWebhookConfiguration{
 		ObjectMeta: metav1.ObjectMeta{
-			Name: webhookConfigName,
+			Name: webhookName,
 		},
 		Webhooks: []admissionv1beta1.MutatingWebhook{
 			{
[mutateHandler->[Write,Msg,Error,Sprintf,Marshal,Msgf,Decode,ReadAll,mutate,Get,Info,Err,Debug],healthHandler->[Write,Error,Msgf,Err,WriteHeader],isNamespaceAllowed->[IsMonitoredNamespace],mustInject->[Msg,Error,Msgf,isNamespaceAllowed,GetNamespace,Info,Err],run->[ListenAndServeTLS,Msg,Error,Sprintf,Msgf,Background,WithCancel,Shutdown,X509KeyPair,GetCertificateChain,Info,HandleFunc,Err,GetPrivateKey],mutate->[Msg,Error,Msgf,New,Unmarshal,createPatch,mustInject,String,Info,Err],Patch,IssueCertificate,Msgf,AdmissionregistrationV1beta1,Info,Error,Marshal,NewCodecFactory,Errorf,GetCertificateChain,Trace,MutatingWebhookConfigurations,UniversalDeserializer,ToLower,Get,Err,NewScheme,Sprintf,Background,CommonName,run]
patchAdmissionResponse returns a response with the specified mutations. configures the MutatingWebhookConfiguration.
Could you rename this to webhookConfigName? This was intentionally changed from webhookName previously to avoid confusing the mutatingwebhookconfiguration name from the webhook (L333) name.
@@ -425,6 +425,8 @@ set_abt_max_num_xstreams(int n)
 	return 0;
 }

+static bool d_abt_on;
+
 static int
 abt_init(int argc, char *argv[])
 {
[int->[setenv,d_tm_set_counter,crt_finalize,printf,hwloc_bitmap_asprintf,hwloc_bitmap_set,dss_engine_metrics_init,drpc_fini,daos_errno2der,nanosleep,d_tm_record_timestamp,dss_self_rank,hwloc_get_nbobjs_by_depth,hwloc_get_type_depth,pl_fini,drpc_init,server_init_state_fini,crt_register_hlc_error_cb,d_tm_init,hwloc_topology_load,D_ERROR,dss_set_start_epoch,crt_register_event_cb,free,set_abt_max_num_xstreams,daos_debug_set_id_cb,dss_module_init_all,strnlen,ABT_init,ABT_cond_create,pl_init,server_init_state_wait,dss_module_load,daos_fini,ABT_mutex_free,daos_debug_init,strsep,D_PRINT,modules_load,crt_hlc_get,dss_topo_init,daos_hhash_fini,dss_tgt_nr_get,dss_xstreams_open_barrier,D_GOTO,dss_module_fini,D_ASSERT,atoi,hwloc_get_obj_by_depth,crt_hlc_epsilon_get_bound,d_hhash_set_ptrtype,daos_crt_init_opt_get,strlen,abt_init,abt_max_num_xstreams,D_WARN,dbtree_class_register,dss_module_unload_all,D_FREE,d_tm_fini,crt_hlc2nsec,hwloc_bitmap_isincluded,D_STRNDUP,register_dbtree_classes,dss_abterr2der,D_INFO,dss_engine_metrics_fini,max,gethostname,ds_iv_fini,getenv,getopt_long,crt_init_opt,DP_RC,daos_init,drpc_notify_ready,dss_srv_init,dss_module_setup_all,dss_module_init,hwloc_bitmap_alloc,daos_hhash_init,sprintf,D_ASPRINTF,sigaction,strcmp,ABT_mutex_create,hwloc_topology_init,d_getenv_bool,server_init_state_init,dss_ctx_nr_get,usage,hwloc_get_nbobjs_by_type,D_ASSERTF,ds_iv_init,daos_debug_fini,getpid,snprintf,dss_srv_fini,abt_fini],main->[sigwait,strerror,sigaddset,sigfillset,pthread_sigmask,sigemptyset,fopen,server_init,fprintf,daos_register_sighand,localtime,gettimeofday,perror,dss_dump_ABT_state,parse,sigdelset,D_ERROR,ABT_info_trigger_print_all_thread_stacks,getpid,server_fini,snprintf,exit,D_INFO],dss_self_rank->[D_ASSERTF,crt_group_rank,DP_RC],get_module_info->[dss_get_module_info],void->[dss_module_fini,strerror,backtrace_symbols_fd,ds_iv_fini,d_log_sync,dss_engine_metrics_fini,daos_fini,backtrace,ABT_mutex_free,crt_finalize,ds_notify_ras_eventf,DP_RC,d_tm_inc_counter,fprintf,drpc_fini,ABT_mutex_lock,d_tm_record_timestamp,ABT_cond_free,fileno,D_DEBUG,dss_module_unload_all,d_tm_fini,ABT_mutex_unlock,pl_fini,dss_get_module_info,sigaction,server_init_state_fini,D_ERROR,ds_notify_swim_rank_dead,crt_unregister_event_cb,dss_module_cleanup_all,ABT_self_get_thread_id,dss_tls_get,memset,daos_debug_fini,daos_hhash_fini,getpid,raise,ABT_cond_wait,ABT_finalize,dss_srv_fini,abt_fini,exit,D_INFO],dss_init_state_set->[ABT_mutex_lock,ABT_mutex_unlock,D_INFO,ABT_cond_broadcast]]
This function is called by DSS_Abt_init and DSS_Abt.
Why did you introduce this ?? Did you trigger some issue (deadlock ?) due to early logging ??
@@ -495,7 +495,7 @@ func (org *User) GetUserRepositories(userID int64, page, pageSize int) ([]*Repos
 	repos := make([]*Repository, 0, pageSize)
 	// FIXME: use XORM chain operations instead of raw SQL.
 	if err = x.Sql(fmt.Sprintf(`SELECT repository.* FROM repository
-		INNER JOIN team_repo
+		INNER JOIN team_repo ON team_repo.repo_id = repository.id
 		WHERE (repository.owner_id = ? AND repository.is_private = ?) OR team_repo.team_id IN (%s)
 		GROUP BY repository.id
[GetUserRepositories->[GetUserTeamIDs],GetUserMirrorRepositories->[GetUserTeamIDs],GetUserTeamIDs->[getUserTeams],GetTeams->[getTeams],GetUserTeams->[getUserTeams],GetTeam->[getTeam],getOwnerTeam->[getTeam],GetOwnerTeam->[getOwnerTeam],RemoveOrgRepo->[removeOrgRepo],GetOwnerTeam]
GetUserRepositories returns list of repositories owned by user Returns nil if the node is not a node ID.
Need to test it but I think the space should stay.
@@ -397,6 +397,8 @@ class Boost(Package):
         b2name = './b2' if spec.satisfies('@1.47:') else './bjam'

         b2 = Executable(b2name)
+        if "platform=cray" in spec and spec.satisfies("target=mic_knl"):
+            b2 = build_env_compilers(b2)
         jobs = make_jobs
         # in 1.59 max jobs became dynamic
         if jobs > 64 and spec.satisfies('@:1.58'):
[Boost->[determine_bootstrap_options->[determine_toolset,bjam_python_line],install->[determine_bootstrap_options,add_buildopt_symlinks,determine_b2_options],determine_b2_options->[determine_toolset,cxxstd_to_flag]]]
Installs a boost library if it is not already installed. Find the missing - free node - config file.
You'll want to wrap the `bootstrap.sh` call above vs. the invocation of `b2`: the goal is to create the `b2` executable using the current architecture (vs. the target arch for the package).
@@ -989,10 +989,10 @@ class NodeRunner:
         try:
             api_server.start(api_host, api_port)
         except APIServerPortInUseError:
-            print(
-                'ERROR: API Address %s:%s is in use. '
-                'Use --api-address <host:port> to specify port to listen on.' %
-                (api_host, api_port),
+            click.secho(
+                f'ERROR: API Address {api_host}:{api_port} is in use. '
+                f'Use --api-address <host:port> to specify a different port.',
+                fg='red',
             )
             sys.exit(1)
[smoketest->[_run_smoketest->[print_step,append_report,run_app],_run_smoketest,print_step,append_report],run_app->[handle_contract_wrong_address,check_discovery_registration_gas,handle_contract_no_code,check_synced,handle_contract_version_mismatch],wait_for_sync->[wait_for_sync_rpc_api,wait_for_sync_etherscan],wait_for_sync_etherscan->[etherscan_query_with_retries],echonode->[EchoNodeRunner],run->[NodeRunner],NodeRunner->[_run_app->[_shutdown_hook,run_app,_startup_hook]]]
Runs the application and returns a list of tasks. Check if a node has a reserved node. Get the application object if it exists.
At this point, the `api_server.start` exception is being handled and `api_server` was not added yet to the `tasks` list. In the line below, you should clean the already started tasks (as done by that `stop_task`+`joinall` in the end of this function), and **then** exit.
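The shutdown ordering the reviewer asks for can be sketched in isolation: when the API server fails to bind its port, first stop every task that has already been started, then exit. The helper and its parameters below are hypothetical stand-ins, not the Raiden code, which uses `stop_task` plus `joinall` at the end of the function:

```python
import sys

# Minimal sketch, assuming started_tasks is the list of already-running
# services and start_server raises OSError when the port is in use.
def start_api_or_exit(start_server, started_tasks):
    try:
        start_server()
    except OSError:
        # Clean up everything started so far *before* exiting, so no
        # background service outlives the failed startup.
        for task in started_tasks:
            task.stop()
        sys.exit(1)
```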
@@ -65,6 +65,10 @@ public class FlinkJobServerDriver implements Runnable {
       printUsage(parser);
       return;
     }
+    //TODO: Expose the fileSystem related options.
+    // Register standard file systems.
+    FileSystems.setDefaultPipelineOptions(
+        PipelineOptionsTranslation.fromProto(Struct.newBuilder().build()));
     FlinkJobServerDriver driver = fromConfig(configuration);
     driver.run();
   }
[FlinkJobServerDriver->[createJobService->[create],createJobServer->[create],create->[FlinkJobServerDriver],printUsage->[printUsage],main->[ServerConfiguration],createJobInvoker->[create]]]
Main method for the FlinkJobServerDriver.
What's the purpose of these lines? Are the standard filesystems (`file:` and `gs:` in this case) not registered without this call? If so, is there a way to achieve the same effect without instantiating a dummy `PipelineOptions` (especially one that has been constructed from an empty `Struct` proto)?
@@ -1954,7 +1954,7 @@ namespace System
                 else
                 {
                     value = new TimeZoneInfo(match!._id, match._baseUtcOffset, match._displayName, match._standardDisplayName,
-                        match._daylightDisplayName, match._adjustmentRules, disableDaylightSavingTime: false);
+                        match._daylightDisplayName, match._adjustmentRules, disableDaylightSavingTime: false, match.HasIanaId);
                 }
             }
             else
[TimeZoneInfo->[TryConvertIanaIdToWindowsId->[TryConvertIanaIdToWindowsId],DateTime->[ClearCachedData],GetHashCode->[GetHashCode],TimeSpan->[GetIsDaylightSavingsFromUtc,GetAdjustmentRuleForTime,GetIsDaylightSavings],TryConvertWindowsIdToIanaId->[TryConvertWindowsIdToIanaId],IsValidAdjustmentRuleOffset->[UtcOffsetOutOfRange],IsAmbiguousTime->[IsAmbiguousTime],GetIsDaylightSavingsFromUtc->[GetAdjustmentRuleForTime],TimeZoneInfoResult->[Equals],GetAdjustmentRuleForTime->[GetAdjustmentRuleForTime],Equals->[Equals],IsDaylightSavingTime->[IsDaylightSavingTime],Equals]]
Try to find a time zone in the local machine.
Are both of these changes getting tested with the test below? Is it possible to test both of them?
@@ -185,6 +185,11 @@ class Trainer(TrainerBase): Gradients are accumulated for the given number of steps before doing an optimizer step. This can be useful to accommodate batches that are larger than the RAM size. Refer Thomas Wolf's [post](https://tinyurl.com/y5mv44fw) for details on Gradient Accumulation. + opt_level : `str`, optional, (default = `None`) + Each opt_level establishes a set of properties that govern Amp’s implementation of pure or mixed + precision training. Must be a choice of `"O0"`, `"O1"`, `"O2"`, or `"O3"`. + See the Apex [documentation](https://nvidia.github.io/apex/amp.html#opt-levels-and-properties) for + more details. If `None`, Amp is not used. Defaults to `None`. """ super().__init__(serialization_dir, cuda_device, distributed, local_rank, world_size)
[Trainer->[_validation_loss->[batch_loss],_train_epoch->[rescale_gradients,batch_loss],rescale_gradients->[rescale_gradients],train->[_validation_loss,_train_epoch]]]
Initialize a Trainer from its base configuration. The new `opt_level` option selects the Apex Amp opt level (`"O0"` through `"O3"`) used for mixed-precision training; `None` disables Amp.
@matt-gardner How's this?
@@ -47,6 +47,13 @@ namespace Dynamo.Search.SearchElements private string _description; public override string Description { get { return _description; } } + /// <summary> + /// Group property </summary> + /// <value> + /// Group to which Node belongs to</value> + private string _group; + public string Group { get { return _group; } } + private bool _searchable = true; public override bool Searchable { get { return _searchable; } }
[NodeSearchElement->[Equals->[Equals],GetHashCode->[GetHashCode]]]
Set the searchable flag.
It'd be nice to have more descriptive summaries for all the new code. For example, `Group property` does not tell a better story than `public string Group`. It would be nice to explain what this property is meant for, and then what are some examples we can expect to see.
@@ -110,7 +110,7 @@ func (suite *GatewayTestSuite) TestLicenseUsageNodes() { Name: "my job for aws-api node manager", Tags: []*common.Kv{}, Type: "exec", - Profiles: []string{"https://github.com/vjeffrey/try-inspec-aws-profile/archive/master.tar.gz"}, + Profiles: []string{"https://github.com/chef/automate/raw/master/components/compliance-service/test_data/inspec_profiles/test-aws-profile-2.0.0.tar.gz"}, NodeSelectors: []*jobs.ManagerFilter{&mgrFilter}, } suite.T().Log("Creating job for aws-api node manager, to execute scan job")
[TestLicenseUsageNodes->[GetIds,Now,LicenseUsageNodes,SecretClient,GetEnvironment,Skip,NotEqual,Logf,ComplianceJobsServiceClient,NewReportingServiceClient,Create,Equal,GetReports,NoError,Log,TimestampProto,Require,GetTotal,Sleep,NodeManagerClient,Background,NodesClient,Getenv,GetId,T]]
TestLicenseUsageNodes tests the license-usage-nodes API call: it creates a test scan job for the aws-api node manager and verifies the resulting usage report.
I guess we'd be a little safer if this wasn't master, but a commit ref
@@ -348,10 +348,11 @@ def c_function_op(name: str, return_type: RType, c_function_name: str, error_kind: int, + var_arg_type: Optional[RType] = None, steals: StealsDescription = False, priority: int = 1) -> CFunctionDescription: ops = c_function_ops.setdefault(name, []) - desc = CFunctionDescription(name, arg_types, return_type, + desc = CFunctionDescription(name, arg_types, return_type, var_arg_type, c_function_name, error_kind, steals, priority) ops.append(desc) return desc
[call_void_emit->[simple_emit],call_and_fail_emit->[simple_emit],name_emit->[simple_emit],call_negative_bool_emit->[simple_emit],call_emit->[simple_emit],call_negative_magic_emit->[negative_int_emit]]
Create a CFunctionDescription for a C function operation and register it under the given name.
Add a docstring here as well.
@@ -733,6 +733,15 @@ public class Jenkins extends AbstractCIBase implements DirectlyModifiableTopLeve */ private transient final AdjunctManager adjuncts; + /** + * Interval in seconds between 2 ajax calls. + */ + private int ajaxRefreshInterval = 6; + public int getAjaxRefreshInterval() { return this.ajaxRefreshInterval;} + public void setAjaxRefreshInterval(int interval) { + this.ajaxRefreshInterval = interval; + } + /** * Code that handles {@link ItemGroup} work. */
[Jenkins->[getUser->[get],_cleanUpShutdownTcpSlaveAgent->[add],setNumExecutors->[updateComputerList],getPlugin->[getPlugin],getCategorizedManagementLinks->[all,add],getViewActions->[getActions],getJDK->[getJDKs,get],setViews->[addView],getCloud->[getByName],getStaplerFallback->[getPrimaryView],getStoredVersion->[get],getViews->[getViews],doDoFingerprintCheck->[isUseCrumbs],deleteView->[deleteView],getLabel->[get],_cleanUpInterruptReloadThread->[add],doConfigSubmit->[save,updateComputerList],CloudList->[onModified->[onModified]],doCheckDisplayName->[isNameUnique,isDisplayNameUnique],_cleanUpPersistQueue->[save,add],getLabelAtom->[get],setBuildsAndWorkspacesDir->[isDefaultWorkspaceDir,isDefaultBuildDir],reload->[loadTasks,save,reload,executeReactor],doConfigExecutorsSubmit->[all,updateComputerList],DescriptorImpl->[getDynamic->[getDescriptor],DescriptorImpl],checkRawBuildsDir->[expandVariablesForDirectory],_cleanUpShutdownThreadPoolForLoad->[add],isDisplayNameUnique->[getDisplayName],_cleanUpRunTerminators->[onTaskFailed->[getDisplayName],execute->[run],onTaskCompleted->[getDisplayName],onTaskStarted->[getDisplayName],add],getJobNames->[getFullName,add],doChildrenContextMenu->[add,getViews,getDisplayName],doLogout->[doLogout],getActiveInstance->[get],getNode->[getNode],copy->[copy],shouldShowStackTrace->[getName],updateNode->[updateNode],doSubmitDescription->[doSubmitDescription],doCheckURIEncoding->[doCheckURIEncoding],getItem->[getItem,get],doViewExistsCheck->[getView],getUnprotectedRootActions->[getActions,add],setAgentProtocols->[add],disableSecurity->[setSecurityRealm],onViewRenamed->[onViewRenamed],getDescriptorByName->[getDescriptor],loadConfig->[getConfigFile],getRootUrl->[get],refreshExtensions->[getInstance,add,getExtensionList],getRootPath->[getRootDir],getView->[getView],putItem->[get],_cleanUpShutdownTimer->[add],_cleanUpDisconnectComputers->[run->[add]],getAllThreadDumps->[get,getComputers],createProject->[createProject,getDescriptor],MasterComputer->[do
ConfigSubmit->[doConfigExecutorsSubmit],hasPermission->[hasPermission],get],createProjectFromXML->[createProjectFromXML],getAgentProtocols->[add],doScript->[getView,getACL],_cleanUpReleaseAllLoggers->[add],isRootUrlSecure->[getRootUrl],EnforceSlaveAgentPortAdministrativeMonitor->[doAct->[forceSetSlaveAgentPort,getExpectedPort],isActivated->[get,getSlaveAgentPortInitialValue],getExpectedPort->[getSlaveAgentPortInitialValue]],setSecurityRealm->[get],getItems->[getItems,add],doCheckViewName->[getView,checkGoodName],removeNode->[removeNode],getSelfLabel->[getLabelAtom],fireBeforeShutdown->[all,add],doSimulateOutOfMemory->[add],restartableLifecycle->[get],expandVariablesForDirectory->[expandVariablesForDirectory,getFullName],_getFingerprint->[get],getManagementLinks->[all],addView->[addView],getPlugins->[getPlugin,getPlugins,add],save->[getConfigFile],getPrimaryView->[getPrimaryView],makeSearchIndex->[get->[getView],makeSearchIndex,add],getNodes->[getNodes],lookup->[get,getInstanceOrNull],getLegacyInstanceId->[getSecretKey],saveQuietly->[save],getLifecycle->[get],getInstanceOrNull->[getInstance],executeReactor->[containsLinkageError->[containsLinkageError],runTask->[runTask]],setNodes->[setNodes],loadTasks->[run->[setSecurityRealm,getExtensionList,getNodes,setNodes,remove,add,loadConfig],add],remove->[remove],getDescriptorOrDie->[getDescriptor],getLabelAtoms->[add],getItemByFullName->[getItemByFullName,getItem],doCreateView->[addView],getExtensionList->[get,getExtensionList],getLabels->[add],restart->[restartableLifecycle],isNameUnique->[getItem],getWorkspaceFor->[all],_cleanUpShutdownPluginManager->[add],getRootDirFor->[getRootDirFor,getRootDir],canDelete->[canDelete],getInstance->[getInstanceOrNull],getFingerprint->[get],getAuthentication->[getAuthentication],doScriptText->[getView,getACL],getDynamic->[getActions],_cleanUpPluginServletFilters->[cleanUp,add],_cleanUpShutdownTriggers->[add],addNode->[addNode],getTopLevelItemNames->[add],MasterRestartNotifyier->[onRestart
->[all]],doQuietDown->[doQuietDown],safeRestart->[restartableLifecycle],updateComputerList->[updateComputerList],rebuildDependencyGraphAsync->[call->[get,rebuildDependencyGraph]],getConfiguredRootUrl->[get],_cleanUpAwaitDisconnects->[get,add],readResolve->[getSlaveAgentPortInitialValue],getName,get]]
Adds a top-level item to the sequence.
I don't know how to configure it right now... still in progress
@@ -911,8 +911,10 @@ class TorchGeneratorAgent(TorchAgent, ABC): else len(batch.image) ) if batch.text_vec is not None: + batchsize = batch.text_vec.size(0) beams = [ - self._treesearch_factory(dev).set_context(ctx) for ctx in batch.text_vec + self._treesearch_factory(dev).set_context(self._get_context(batch, batch_idx)) + for batch_idx in range(batchsize) ] else: beams = [self._treesearch_factory(dev) for _ in range(bsz)]
[TorchGeneratorAgent->[_compute_nltk_bleu->[_bleu,_v2t],_init_cuda_buffer->[_dummy_batch],reset_metrics->[_init_and_reset_bleu_scorers],compute_loss->[_model_input],_encoder_input->[_model_input],eval_step->[_compute_fairseq_bleu,_construct_token_losses,decode_forced,_compute_nltk_bleu,reorder_encoder_states,_encoder_input,compute_loss,_v2t],train_step->[_init_cuda_buffer,compute_loss],_generate->[_treesearch_factory,reorder_encoder_states,_encoder_input,reorder_decoder_incremental_state]],TreeSearch->[_get_hyp_from_finished->[_HypothesisTail],advance->[_block_ngrams,select_paths,_HypothesisTail],get_rescored_finished->[_get_hyp_from_finished,_get_pretty_hypothesis,_HypothesisTail]],TorchGeneratorModel->[forward->[decode_forced]]]
Generate n-best scores for each input using beam search, then take the top prediction among the candidates in each beam.
we gotta kill this
@@ -253,3 +253,12 @@ func checkCertsPEM(cfgErr *shared.InvalidConfigError, key, data string) { } } } + +func (samlCfg *ConfigRequest_V1_Saml) setNameIDPolicyDefault() { + if samlCfg == nil { + return + } + if samlCfg.NameIdPolicyFormat == nil { + samlCfg.NameIdPolicyFormat = w.String("urn:oasis:names:tc:SAML:2.0:nameid-format:persistent") + } +}
[SetGlobalConfig->[GetLevel,GetLog,GetV1,GetValue],Validate->[AddInvalidValue,IsEmpty,AddMissingKey,NewInvalidConfigError,GetValue],PrepareSystemConfig->[GetLdap,fixCommonCaseIssues,GetConnectors],AddInvalidValue,Sprintf,Decode,ToLower,Int32,String,Bool]
setNameIDPolicyDefault defaults the SAML NameID policy format to the persistent format when none is configured.
What makes this code unique, in that it requires checking for the zero-object of the type? I cannot recall ever seeing this check in all the auth methods we've implemented. (That is, clearly if we got a nil here we would not want to continue. But wouldn't that be true in lots of methods in auth code, too?)
@@ -61,6 +61,16 @@ class ParticleGiDOutputProcess(KratosMultiphysics.Process): self.printed_step_count = 0 self.next_output = 0.0 + # This function can be extended with new deprecated variables as they are generated + def TranslateLegacyVariablesAccordingToCurrentStandard(self, settings): + + if settings.Has('result_file_configuration'): + sub_settings_where_var_is = settings['result_file_configuration'] + old_name = 'output_frequency' + new_name = 'output_interval' + + if DeprecationManager.HasDeprecatedVariable(context_string, sub_settings_where_var_is, old_name, new_name): + DeprecationManager.ReplaceDeprecatedVariableName(sub_settings_where_var_is, old_name, new_name) # Public Functions def ExecuteInitialize(self):
[ParticleGiDOutputProcess->[_get_variable->[_get_attribute],__init__->[__init__]]]
Initialize the output process with a base file name and a parameter object. TranslateLegacyVariablesAccordingToCurrentStandard renames deprecated settings such as `output_frequency` to `output_interval`.
codacy has a point here, `context_string` is not defined!
@@ -125,13 +125,14 @@ tasks = { 'extract_webextensions_to_git_storage': { 'method': migrate_webextensions_to_git_storage, 'qs': [ - Q(_current_version__files__is_webextension=True, - type__in=( + Q(type__in=( # Ignoring legacy add-ons and lightweight themes amo.ADDON_EXTENSION, amo.ADDON_STATICTHEME, amo.ADDON_DICT, amo.ADDON_LPAPP)) | Q(type=amo.ADDON_SEARCH) - ] + ], + 'distinct': True, + 'allowed_kwargs': ('channel',), }, 'extract_colors_from_static_themes': { 'method': extract_colors_from_static_themes,
[Command->[handle->[get_pks]],get_recalc_needed_filters]
Registry of add-on maintenance tasks (webextension extraction, theme color extraction, etc.), keyed by task name, each with its queryset and run options.
you can combine `ADDON_SEARCH` into the `type__in` list above now (though I'm not sure there are actually any add-ons of other types anyway)
@@ -23,7 +23,11 @@ namespace System.Timers /// <summary> /// Constructs a new localized sys description. /// </summary> - internal TimersDescriptionAttribute(string description, string defaultValue) : base(SR.GetResourceString(description, defaultValue)) { } + internal TimersDescriptionAttribute(string description, string unused) : base(SR.GetResourceString(description)) + { + // Needed for overload resolution + System.Diagnostics.Debug.Assert(unused == null); + } /// <summary> /// Retrieves the description text.
[TimersDescriptionAttribute->[Format,Description,GetResourceString,All]]
Creates a description attribute that marks a property, event, or extender with a localized description.
Nit: we generally would have a `using System.Diagnostics;` and then just `Debug.Assert(unused == null)` here.
@@ -2890,6 +2890,9 @@ public class Jenkins extends AbstractCIBase implements DirectlyModifiableTopLeve LogFactory.releaseAll(); theInstance = null; + if (JenkinsJVM.isJenkinsJVM()) { + JenkinsJVMAccess._setJenkinsJVM(oldJenkinsJVM); + } } public Object getDynamic(String token) {
[Jenkins->[getAllItems->[getAllItems],getUser->[get],setNumExecutors->[updateComputerList],getPlugin->[getPlugin],getViewActions->[getActions],getJDK->[getJDKs],getCloud->[getByName],getStaplerFallback->[getPrimaryView],getViews->[getViews],doDoFingerprintCheck->[cleanUp,isUseCrumbs],checkJobName->[getItem,checkGoodName],deleteView->[deleteView],doConfigSubmit->[updateComputerList,save,get,setJDKs],CloudList->[onModified->[onModified]],doCheckDisplayName->[isNameUnique,isDisplayNameUnique],reload->[loadTasks,reload,executeReactor],doConfigExecutorsSubmit->[all,get,updateComputerList],DescriptorImpl->[getDynamic->[getDescriptor],DescriptorImpl],isDisplayNameUnique->[getDisplayName],getJobNames->[getFullName,getAllItems,add],doChildrenContextMenu->[add,getViews,getDisplayName],doLogout->[doLogout],getActiveInstance->[getInstance],getNode->[getNode],copy->[copy],updateNode->[updateNode],cleanUp->[execute->[run],run->[add],all,save,get,cleanUp],doSubmitDescription->[doSubmitDescription],getItem->[getItem,get],doViewExistsCheck->[getView],getUnprotectedRootActions->[getActions,add],disableSecurity->[setSecurityRealm],onViewRenamed->[onViewRenamed],getDescriptorByName->[getDescriptor],refreshExtensions->[getInstance,add,getExtensionList],getRootPath->[getRootDir],getView->[getView],putItem->[get],getAllThreadDumps->[get,getComputers],createProject->[createProject,getDescriptor],MasterComputer->[doConfigSubmit->[doConfigExecutorsSubmit],hasPermission->[hasPermission],getInstance],createProjectFromXML->[createProjectFromXML],doScript->[getView,getACL],isRootUrlSecure->[getRootUrl],setSecurityRealm->[get],doCheckJobName->[checkJobName],getItems->[getItems,add],doCheckViewName->[getView,checkGoodName],removeNode->[removeNode],getSelfLabel->[getLabelAtom],doSimulateOutOfMemory->[add],expandVariablesForDirectory->[expandVariablesForDirectory,getFullName],_getFingerprint->[get],resolveDependantPlugins->[run->[resolveDependantPlugins],add],getManagementLinks->[all],addView->[addV
iew],getPlugins->[getPlugin,getPlugins,add],save->[getConfigFile],getPrimaryView->[getPrimaryView],getDescriptorList->[get],makeSearchIndex->[get->[getView],add],getNodes->[getNodes],lookup->[get,getInstance],getLegacyInstanceId->[getSecretKey],getLifecycle->[get],executeReactor->[runTask->[runTask],run],setNodes->[setNodes],loadTasks->[run->[setSecurityRealm,getExtensionList,getNodes,setNodes,remove,getConfigFile,add],add],remove->[remove],getDescriptorOrDie->[getDescriptor],getLabelAtoms->[add],getItemByFullName->[getItemByFullName,getItem],doCreateView->[addView],getExtensionList->[get,getExtensionList],getLabels->[add],restart->[get],isNameUnique->[getItem],getWorkspaceFor->[all],getRootDirFor->[getRootDirFor,getRootDir],canDelete->[canDelete],getInstance->[getInstance],getFingerprint->[get],getAuthentication->[getAuthentication],doScriptText->[getView,getACL],getDynamic->[getActions],addNode->[addNode],getTopLevelItemNames->[add],MasterRestartNotifyier->[onRestart->[all]],doQuietDown->[doQuietDown],safeRestart->[get],updateComputerList->[updateComputerList],rebuildDependencyGraphAsync->[call->[get,rebuildDependencyGraph]],getName]]
cleanUp releases resources while Jenkins is shutting down, restoring the previous JenkinsJVM state; getDynamic resolves a dynamic action by its URL token.
BTW is there really any use case for calling `_setJenkinsJVM(false)`? I mean, if the intended purpose of `JenkinsJVM` is to differentiate master from slave VMs, once a master always a master, regardless of whether it is shutting down, restarting, whatever. You potentially need a null check on `getInstance()` anyway, if your code might be run at weird times.
@@ -673,3 +673,18 @@ char *AnchorRegexNew(const char *regex) return ret; } + +bool HasRegexMetaChars(const char *string) +{ + if (!string) + { + return false; + } + + if (strcspn(string, "\\^${}[]().*+?|<>-&") == strlen(string)) + { + return false; + } + + return true; +}
[MatchPolicy->[FullTextMatch],MatchRlistItem->[FullTextMatch],IsRegexItemIn->[FullTextMatch],IsPathRegex->[IsRegex]]
AnchorRegexNew returns a copy of the given regular expression anchored at both ends. HasRegexMetaChars reports whether a string contains any regex metacharacters.
Stray thought: strlen() has to traverse the string looking for a '\0' byte, which strcspn already did for us. We could make this test cheaper by writing it as: <code> if (string[strcspn(string, "metachars")] == '\0') /\* i.e. no metachars appear in string */ </code>
@@ -86,6 +86,17 @@ const ADVERTISEMENT_ATTR_NAME = 'ad'; /** @private @const {number} */ const REWIND_TIMEOUT_MS = 350; +/** + * @param {!Element} element + * @return {!Element} + */ +const buildPlayMessageElement = element => + htmlFor(element)` + <button role="button" class="i-amphtml-story-page-play-button"> + <span class="i-amphtml-story-page-play-label"></span> + <span class='i-amphtml-story-page-play-icon'></span> + </button>`; + /** * amp-story-page states. * @enum {number}
[AmpStoryPage->[setState->[PLAYING,dev,NOT_ACTIVE,PAUSED],pauseCallback->[DESKTOP,UI_STATE],constructor->[resolve,timerFor,NOT_ACTIVE,forElement,getStoreService,platformFor,promise,debounce,reject],emitProgress_->[dict,PAGE_PROGRESS,dispatch],unmuteAllMedia->[removeAttribute,unmute,muted],getPreviousPageId->[id,tagName],firstAttachedCallback->[matches],rewindAllMedia_->[rewindToBeginning,currentTime],preloadAllMedia_->[preload],registerAllMedia_->[register],playAllMedia_->[play],maybeApplyFirstAnimationFrame->[resolve],setDistance->[min],hasVideoWithAudio_->[prototype,hasAttribute],pauseAllMedia_->[pause],buildCallback->[upgradeBackgroundAudio],waitForMediaLayout_->[addEventListener,resolve,all,tagName,prototype,LOAD_END,ALL_AMP_MEDIA,signals,readyState],maybeCreateAnimationManager_->[create,hasAnimations],layoutCallback->[all],switchTo_->[dict,SWITCH_PAGE,dispatch],getMediaBySelector_->[scopedQuerySelectorAll,iterateCursor,getFriendlyIframeEmbedOptional,push,win],getAdjacentPageIds->[push],muteAllMedia->[setAttribute,mute,muted],isLayoutSupported->[CONTAINER],markPageAsLoaded_->[PAGE_LOADED,dispatch],delegateVideoAutoplay->[length,delegateAutoplay,toArray],checkPageHasAudio_->[TOGGLE_PAGE_HAS_AUDIO],getDistance->[parseInt],getAllMedia_->[ALL_MEDIA],whenAllMediaElements_->[prototype,all,callbackFn],stopListeningToVideoEvents_->[unlisten],previous->[dispatch,SHOW_NO_PREVIOUS_PAGE_HELP],getAllVideos_->[ALL_VIDEO],markMediaElementsWithPreload_->[setAttribute,prototype],initializeMediaPool_->[dev,getImpl,for,closestBySelector],getNextPageId->[id,tagName],reportDevModeErrors_->[getMode,getLogEntries,dispatch,DEV_LOG_ENTRIES_AVAILABLE],startListeningToVideoEvents_->[prototype,length,listen],BaseElement]]
A custom element representing a single page of an amp-story. buildPlayMessageElement builds the play-button markup used on the page.
Should we at least have the reset class on this? Arguably, the spinner and this could go in a shadow tree
@@ -209,6 +209,16 @@ func CreateServer(ctx context.Context, cfg *config.Config, apiBuilders ...Handle pdAPIPrefix: apiHandler, webPath: http.StripPrefix(webPath, ui.Handler()), } + + if cfg.EnableDashboard { + etcdCfg.UserHandlers[dashboardUIPath] = http.StripPrefix(dashboardUIPath, uiserver.Handler()) + etcdCfg.UserHandlers[dashboardAPIPath] = apiserver.Handler(dashboardAPIPath, &dashboardConfig.Config{ + DataDir: cfg.DataDir, + PDEndPoint: etcdCfg.ACUrls[0].String(), + }) + log.Info("Enabled Dashboard API", zap.String("path", dashboardAPIPath)) + log.Info("Enabled Dashboard UI", zap.String("path", dashboardUIPath)) + } } etcdCfg.ServiceRegister = func(gs *grpc.Server) { pdpb.RegisterPDServer(gs, s)
[SetPDServerConfig->[SetPDServerConfig],Run->[startEtcd,startServer],GetMetaRegions->[GetMetaRegions,GetRaftCluster],leaderLoop->[Name,IsClosed],campaignLeader->[createRaftCluster,Name,Close,stopRaftCluster],GetRaftCluster->[IsClosed],GetLeader->[GetLeader],SetClusterVersion->[SetClusterVersion],DeleteLabelProperty->[DeleteLabelProperty,SetLabelProperty],Close->[Close],GetConfig->[GetStorage],SetLabelProperty->[SetLabelProperty]]
CreateServer creates a PD server that has not yet been started, optionally registering the dashboard UI and API handlers. startEtcd starts the embedded etcd node.
Do we need to make it dynamically changeable? If not, we can remove it and enable it by default; we already have a compiler option. I mainly want to control some service runtime behavior, maybe do this later.
@@ -70,7 +70,7 @@ public class DocumentPaginatedQuery { @Param(name = "pageSize", required = false, description = "Entries number per page.") protected Integer pageSize; - @Param(name = "queryParams", required = false, description = "Ordered query parameters.") + @Param(name = "queryParams", alias = "searchTerm", required = false, description = "Ordered query parameters.") protected StringList strParameters; @Param(name = "sortBy", required = false, description = "Sort by properties (separated by comma)")
[DocumentPaginatedQuery->[run->[longValue,toArray,singletonMap,hasError,getPageProvider,PaginableDocumentModelListImpl,toString,OperationException,getQueryPageProviderDefinition,getErrorMessage]]]
This operation runs a paginated document query, returning all the documents the user has access to.
What's the goal of this new alias? Could it be described in the upgrade notes of the ticket?
@@ -116,7 +116,8 @@ QueryData genCarves(QueryContext& context) { LOG(WARNING) << "Error inserting new carve entry into the database: " << s.getMessage(); } else { - Dispatcher::addService(std::make_shared<Carver>(paths, guid)); + auto requestId = Distributed::getCurrentRequestId(); + Dispatcher::addService(std::make_shared<Carver>(paths, guid, requestId)); } } enumerateCarves(results);
[genCarves->[generateNewUUID,getMessage,LOG,ok,size,enumerateCarves,insert,constraints,expandConstraints,begin,getUnixTime,str,resolveFilePattern,put,setDatabaseValue],enumerateCarves->[SQL_TEXT,INTEGER,ok,scanDatabaseKeys,VLOG,BIGINT,what,getDatabaseValue,get<int>,push_back,string>],DECLARE_bool]
Generate a list of carve entries.
Probably not an issue now, but if we have multiple threads doing distributed queries, we could get two carves with the same requestID, I think.
@@ -500,6 +500,10 @@ duns_create_path(daos_handle_t poh, const char *path, struct duns_attr_t *attrp) return -DER_INVAL; } + if (fs.f_type == FUSE_SUPER_MAGIC) { + backend_dfuse = true; + } + #ifdef LUSTRE_INCLUDE if (fs.f_type == LL_SUPER_MAGIC) { rc = duns_create_lustre_path(poh, path, attrp);
[No CFG could be retrieved]
duns_create_path creates the path that backs a container: a file for an HDF5 container, a directory for a POSIX container, with Lustre- and FUSE-specific handling depending on the underlying filesystem.
where is that set btw? is it by fuse itself?
@@ -416,6 +416,7 @@ public class KafkaSupervisor implements Supervisor ioConfig.getStartDelay(), spec.toString() ); + } catch (Exception e) { if (consumer != null) {
[KafkaSupervisor->[updateCurrentAndLatestOffsets->[updateCurrentOffsets,updateLatestOffsetsFromKafka],emitLag->[getHighestCurrentOffsets],checkpointTaskGroup->[apply->[taskIds],taskIds],discoverTasks->[apply->[TaskData,TaskGroup]],addDiscoveredTaskToPendingCompletionTaskGroups->[TaskData,TaskGroup],buildRunTask->[RunNotice],verifyAndMergeCheckpoints->[taskIds],runInternal->[possiblyRegisterListener,gracefulShutdownInternal,toString],start->[toString],GracefulShutdownNotice->[handle->[handle]],checkTaskDuration->[taskIds],checkCurrentTaskState->[taskIds],getCurrentTotalStats->[getStats,taskIds],createNewTasks->[verifyAndMergeCheckpoints,TaskGroup],checkPendingCompletionTasks->[killTasksInGroup,taskIds],CheckpointNotice->[handle->[addNewCheckpoint]]]]
Starts the Kafka supervisor; if an exception occurs during startup, the consumer is closed.
Please remove unnecessary change.
@@ -471,7 +471,7 @@ const Recording = { return; } - sendAnalyticsEvent(RECORDING_CLICKED); + sendAnalytics(createToolbarEvent('recording.button')); switch (this.currentState) { case JitsiRecordingStatus.ON: case JitsiRecordingStatus.RETRYING:
[No CFG could be retrieved]
Handles clicks on the record toolbar button. Determines whether a recording token is required and sends the appropriate analytics events.
I wonder why this only gets logged when there is no dialog. Maybe it should always get logged and dialog present be an attribute?
@@ -524,6 +524,12 @@ int WbGeometry::triangleCount() const { return 0; } +bool WbGeometry::exportNodeHeader(WbVrmlWriter &writer) const { + if (writer.isUrdf()) + return true; + return WbNode::exportNodeHeader(writer); +} + //////////////////////////////// // Position and orientation // ////////////////////////////////
[computeWrenRenderable->[applyVisibilityFlagToWren,isSelected],destroyWrenObjects->[deleteWrenRenderable],updateBoundingObjectVisibility->[applyVisibilityFlagToWren],matrix->[matrix],WbBaseNode->[init],resizeManipulator->[createResizeManipulatorIfNeeded],attachResizeManipulator->[createResizeManipulatorIfNeeded],setOdeData->[createOdeObjects],setUniformConstraintForResizeHandles->[createResizeManipulatorIfNeeded],WbVector3->[WbVector3],exportBoundingObjectToX3D->[WbVector3],isSelected->[isSelected],propagateSelection->[applyVisibilityFlagToWren],setOdeRotation->[setOdePosition],absolutePosition->[WbVector3],createResizeManipulatorIfNeeded->[checkForResizeManipulator],updateContextDependentObjects->[checkForResizeManipulator]]
Returns the number of triangles in this Geometry.
Is there a specific reason for not calling `WbBaseNode::exportNodeHeader` ?
@@ -45,9 +45,10 @@ namespace Content.Server.StationEvents _failDuration = IoCManager.Resolve<IRobustRandom>().Next(30, 120); var componentManager = IoCManager.Resolve<IComponentManager>(); - foreach (var component in componentManager.EntityQuery<PowerReceiverComponent>()) + foreach (PowerReceiverComponent component in componentManager.EntityQuery<PowerReceiverComponent>()) { component.PowerDisabled = true; + _powered.Add(component.Owner); } }
[PowerGridCheck->[Shutdown->[Shutdown],Startup->[Startup]]]
Overrides Startup to disable power on every PowerReceiverComponent, recording the affected entities.
Couldn't you just do _powered[component.Owner] = component.PowerDisabled; so disabled stuff at least doesn't get turned back on for now.
@@ -39,8 +39,12 @@ class SublimeText(Package): return "https://download.sublimetext.com/sublime_text_{0}_build_{1}_x64.tar.bz2".format(version[0], version[-1]) def install(self, spec, prefix): - # Sublime text comes as a pre-compiled binary. install_tree('.', prefix) + src = join_path(prefix, 'sublime_text') + dst = join_path(prefix, 'bin') + mkdirp(dst) + force_symlink(src, join_path(dst, 'sublime_text')) + force_symlink(src, join_path(dst, 'subl')) def setup_environment(self, spack_env, run_env): - run_env.prepend_path('PATH', self.prefix) + run_env.prepend_path('PATH', join_path(self.prefix, "bin"))
[SublimeText->[install->[install_tree],url_for_version->[format],setup_environment->[prepend_path],depends_on,version]]
Installs the pre-compiled Sublime Text binary and exposes it on the PATH.
This method is no longer needed, spack adds `prefix.bin` to the `PATH` by default.
@@ -4,12 +4,12 @@ class TaMembership < Membership after_create { Repository.get_class.update_permissions } after_destroy { Repository.get_class.update_permissions } - def must_be_a_ta - if user && !user.is_a?(Ta) - errors.add('base', 'User must be a ta') - false - end - end + def must_be_a_ta + if user && !user.is_a?(Ta) + errors.add('base', 'User must be a ta') + false + end + end def self.from_csv(assignment, csv_data, remove_existing) if remove_existing
[TaMembership->[from_csv->[nil?,import,empty?,new,update_permissions_after,parse,update_criteria_coverage_counts,pluck,raise,read,assign_graders_to_criteria,delete_all,each,map],must_be_a_ta->[add,is_a?],after_destroy,validate,update_permissions,after_create]]
Validates that the membership's user is a TA, adding a base error and returning false otherwise.
This is a good style change, but doesn't belong in this pull request. Please revert this change here, and then make a separate branch that pull request to fix the indentation for this function.
@@ -196,6 +196,11 @@ public class StatementExecutor implements KsqlConfigurable { executeStatement( statement, command, commandId, commandStatusFuture, mode); } catch (final KsqlException exception) { + final Throwable rootCause = ExceptionUtils.getRootCause(exception); + if (mode == Mode.RESTORE && rootCause instanceof KsqlMissingSourceException) { + ksqlEngine.incrementQueryId(); + } + log.error("Failed to handle: " + command, exception); final CommandStatus errorStatus = new CommandStatus( CommandStatus.Status.ERROR,
[StatementExecutor->[handleStatementWithTerminatedQueries->[putFinalStatus,putStatus],executeStatement->[putFinalStatus]]]
Handles a statement with terminated queries.
Apologies if I've missed something as I'm late to the game here: there are multiple reasons a statement could fail during recovery, besides a missing source, so I don't believe this approach works. (For example, I originally discovered the recovery bug when a statement failed to execute because Schema Registry rejected a sink schema due to a failed schema compatibility check.) If we're pursuing this fix for legacy query IDs (rather than updating how query IDs are generated and leaving the bug for legacy query IDs, as proposed in the discussion above), then we should always increment the query ID on recovery, regardless of the reason for failure.
@@ -215,7 +215,9 @@ public class GobblinHelixJobLauncher extends AbstractJobLauncher { if (this.jobSubmitted) { try { log.info("[DELETE] workflow {}", this.helixWorkFlowName); - this.helixTaskDriver.delete(this.helixWorkFlowName); + if (this.cancellationRequested) { + this.helixTaskDriver.delete(this.helixWorkFlowName); + } } catch (IllegalArgumentException e) { LOGGER.warn("Failed to cancel job {} in Helix", this.jobContext.getJobId(), e); }
[GobblinHelixJobLauncher->[close->[close],launchJob->[launchJob]]]
Execute cancellation.
Is there a cancellation/fail call? The delete here will mean for normal execution the workflow is retained, but for delete it is immediately removed.
@@ -1316,8 +1316,14 @@ def _grow_nonoverlapping_labels(subject, seeds_, extents_, hemis, vertices_, return labels - +@verbose +@deprecated("_read_annot() will be removed in release 0.9. Use " + "read_annot() instead.") def _read_annot(fname): + read_annot(fname) + + +def read_annot(fname): """Read a Freesurfer annotation from a .annot file. Note : Copied from PySurfer
[write_annot->[_n_colors,_write_annot,_get_annot_fname],parc_from_labels->[copy],BiHemiLabel->[__add__->[_blend_colors,BiHemiLabel]],split_label->[Label,read_label,_split_colors],label_time_courses->[read_label],write_label->[split],read_annot->[Label,_read_annot,_get_annot_fname],stc_to_label->[Label,_n_colors],Label->[morph->[copy],__add__->[_blend_colors,Label]],_grow_nonoverlapping_labels->[Label],_grow_labels->[_verts_within_dist,Label],_read_annot->[split],read_label->[split,Label],grow_labels->[_n_colors]]
Grow labels while ensuring they don't overlap; read a Freesurfer annotation from a .annot file and return the labels, color table, and names.
Private functions should not have to be deprecated
@@ -58,6 +58,13 @@ const files = { 'content/css/_vendor.css', 'content/css/_documentation.css' ] + }, + { + condition: generator => !generator.useSass && generator.enableI18nRTL, + path: MAIN_SRC_DIR, + templates: [ + 'content/css/_rtl.css', + ] } ], sass: [
[No CFG could be retrieved]
Generator file configuration: declares the CSS templates to copy, conditionally including an RTL stylesheet when RTL i18n is enabled without Sass.
Does it mean that enableI18nRTL is a global option for the whole app? If so, can we build an app with both a mix of LTR and RTL languages?
@@ -43,6 +43,12 @@ public class TestAccumuloIntegrationSmokeTest return false; } + @Override + protected TestTable createTableWithDefaultColumns() + { + throw new SkipException("requirement not met"); + } + @Override public void testDescribeTable() {
[TestAccumuloIntegrationSmokeTest->[testDescribeTable->[assertEquals,getField,toTestTypes],createQueryRunner->[of,createAccumuloQueryRunner]]]
Overrides createTableWithDefaultColumns to skip tests that depend on it in the Accumulo smoke test.
What kind of a requirement is not met? Please be more specific.
@@ -26,7 +26,8 @@ class AccountsController < ApplicationController @user.save(validate: false) end - redirect_to account_path, notice: (message.presence || t("messages.accounts.saved")) + I18n.locale = @user.locale + redirect_to account_path, notice: (message.presence || t("messages.accounts.updated")) else render "/accounts/show" end
[AccountsController->[update->[redirect_to,find,render,transaction,attributes,update_column,save,resend_confirmation_instructions,valid?,email_changed?,t,presence,id],permits,before_action]]
Updates a user's unconfirmed email and resends confirmation instructions when the email changed.
Line is too long. [92/90]
@@ -381,7 +381,7 @@ namespace Microsoft.Xna.Framework.Graphics { if (_sortMode == SpriteSortMode.Immediate) { - _batcher.DrawBatch(_sortMode); + _batcher.DrawBatch(_sortMode, _effect); } }
[SpriteBatch->[Draw->[Draw,CheckValid],Dispose->[Dispose],Begin->[Begin],DrawString->[CheckValid]]]
Draws the batch immediately, passing the effect, when the sort mode is Immediate.
Going to have to pass both _spritePass and _effect here.
@@ -190,8 +190,11 @@ class FixedDialogTeacher(Teacher): 'has episodes of multiple examples), or streaming ' 'the data using a streamed data mode if supported.') + clen = opt.get('context_length', -1) + incl = opt.get('include_labels', True) flatdata = flatten(ordered_teacher, context_length=clen, include_labels=incl) + self.sorted_data = sort_data(flatdata) self.batches = make_batches(self.sorted_data, self.bsz) # one fixed-seed shuffle keeps determinism but makes sure that
[StreamDialogData->[reset->[_load],num_examples->[load_length],num_episodes->[load_length],__init__->[get],load_length->[_read_episode],get->[_lock,build_table],_data_generator->[_read_episode]],FbDialogTeacher->[__init__->[get]],DialogData->[build_table->[get],_load->[_read_episode],__init__->[get]],FixedDialogTeacher->[next_example->[next_episode_idx],reset->[_lock],__init__->[DataLoader],act->[get,reset,next_example],next_episode_idx->[_lock],next_batch->[_lock],batch_act->[reset,get,next_batch]],DialogTeacher->[next_example->[get],share->[share],reset->[reset],num_examples->[num_examples],num_episodes->[num_episodes],__init__->[reset,get],get->[get]],ParlAIDialogTeacher->[_setup_data->[get],__init__->[reset,get]],DataLoader->[__init__->[__init__]]]
Initializes the teacher with the given options, flattening and sorting the ordered data into fixed batches.
Can we strip these changes?
@@ -40,6 +40,14 @@ #define TESTBENCH_NCH 2 /* Stereo */ +/* shared library look up table */ +struct shared_lib_table lib_table[NUM_WIDGETS_SUPPORTED] = { +{"file", "", SND_SOC_TPLG_DAPM_AIF_IN, "", 0, NULL}, +{"vol", "libsof_volume.so", SND_SOC_TPLG_DAPM_PGA, "sys_comp_volume_init", 0, + NULL}, +{"src", "libsof_src.so", SND_SOC_TPLG_DAPM_SRC, "sys_comp_src_init", 0, NULL}, +}; + /* main firmware context */ static struct sof sof; static int fr_id; /* comp id for fileread */
[No CFG could be retrieved]
Defines the shared-library lookup table mapping widget types to their component libraries and init functions.
don't set a size here, just use ARRAY_SIZE to iterate over this.
@@ -489,3 +489,11 @@ def check_file_existence(file): return True except (FileNotFoundError, IsADirectoryError): return False + +def get_stride_from_config(config, allow_none=False): + if 'stride' in config: + return config['stride'] + if not allow_none: + raise ValueError('Parameter stride required') + + return None, None
[get_size_from_config->[contains_all],JSONDecoderWithAutoConversion->[_decode->[_decode]],read_yaml->[get_path],get_or_parse_value->[string_to_tuple],OrderedSet->[pop->[discard]],check_file_existence->[get_path],read_csv->[get_path],read_pickle->[get_path],read_json->[get_path],read_txt->[get_path,is_empty],convert_to_range->[string_to_tuple],get_size_3d_from_config->[contains_all]]
Reads the 'stride' parameter from a config, raising a ValueError unless allow_none is set.
As I understand, stride is a required argument for your preprocessor, right? What about setting it optional=False in parameters? That will mean it must always be provided (None is not allowed, and this is checked automatically). BTW, why do you return a tuple with 2 None values at the end?
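A sketch of the helper with the reviewer's points addressed: stride treated as required unless `allow_none` is set, and a single `None` returned instead of the stray `(None, None)` tuple (hypothetical variant, not the project's code):

```python
def get_stride_from_config(config, allow_none=False):
    """Fetch the 'stride' parameter; raise unless allow_none permits absence."""
    if 'stride' in config:
        return config['stride']
    if not allow_none:
        raise ValueError('Parameter stride required')
    return None  # one sentinel, consistent with the single-value success path
```

Returning one value on every path keeps callers from having to guess whether to unpack the result.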
@@ -37,15 +37,6 @@ import java.util.Map; public interface AlertCondition { String getDescription(); - /** - * The limited list of internal message objects that matched the alert. - * @see org.graylog2.plugin.alarms.AlertCondition.CheckResult#getMatchingMessages() - * @return list of Message objects - */ - @Deprecated - @JsonIgnore - List<Message> getSearchHits(); - String getId(); DateTime getCreatedAt();
[No CFG could be retrieved]
Returns a description of the alert condition.
Although the method has been marked as deprecated, this is a breaking change which affects all existing alarm callbacks (and there are actually some plugins providing those), so I would not remove that method in the Graylog 1.x version line.
@@ -2841,8 +2841,9 @@ public class JuniperSrxResource implements ServerResource { action = "<permit></permit>"; } - xml = replaceXmlValue(xml, "action", action); + } + xml = replaceXmlValue(xml, "action", action); } else { xml = replaceXmlValue(xml, "from-zone", fromZone); xml = replaceXmlValue(xml, "to-zone", toZone);
[JuniperSrxResource->[getUnusedApplications->[getXml],commitConfiguration->[closeConfiguration,getXml],manageIkePolicy->[genIkePolicyName,getXml,manageIkePolicy],manageSourceNatPool->[manageSourceNatPool,getXml,genSourceNatPoolName],manageIkeGateway->[genIkeGatewayName,getXml,manageIkeGateway],configure->[UsageFilter],removeStaticAndDestNatRulesInPrivateVlan->[removeDestinationNatRules,removeStaticNatRules],execute->[refreshSrxConnection,commitConfiguration,openConfiguration,getActiveFirewallEgressRules,getIcmpType,FirewallFilterTerm,getIcmpCode,getType,getProtocol,getCounterIdentifier,closeConfiguration,execute,extractCidrs],managePrivateInterface->[getName,getXml,managePrivateInterface],getUsageAnswer->[getXml,closeUsageSocket,openUsageSocket],genPortRangeEntry->[genNameValueEntry],sendUsageRequest->[sendRequestPrim],manageAddressPool->[manageAddressPool,getXml,genAddressPoolName],sendRequest->[sendRequestPrim],addEgressSecurityPolicyAndApplications->[manageApplication,genApplicationName,manageAddressBookEntry,manageSecurityPolicy],openUsageSocket->[usageLogin],manageSourceNatRule->[genSourceNatPoolName,manageSourceNatPool,manageSourceNatRule,genSourceNatRuleName,getXml],manageDestinationNatRule->[manageDestinationNatPool,genDestinationNatRuleName,manageDestinationNatRule,genDestinationNatPoolName,getXml],getUsageFilter->[getCounterIdentifier],parseApplicationName->[getProtocol],removeDestinationNatRules->[removeDestinationNatRule],removeStaticNatRules->[removeStaticNatRule],genNameValueEntry->[getXml],genIcmpEntries->[genNameValueEntry],getVpnObjectNames->[getXml],getDestNatRules->[getXml],manageApplication->[manageApplication,genApplicationName,getXml],sendRequestAndCheckResponse->[checkResponse,sendRequest],manageDestinationNatPool->[manageDestinationNatPool,getXml,genDestinationNatPoolName],genSecurityPolicyName->[getIdentifier],getPublicVlanTagsForPublicIps->[getXml],getPublicVlanTagsForNatRules->[getPublicVlanTagsForPublicIps],populatePublicVlanTagsMap->[get
VlanTagFromInterfaceName],manageDynamicVpnClient->[manageDynamicVpnClient,getXml,genDynamicVpnClientName],setDelete->[replaceXmlValue,replaceXmlTag],manageStaticNatRule->[getXml,genStaticNatRuleName,manageStaticNatRule],manageAddressBookEntry->[manageAddressBookEntry,getXml,genAddressBookEntryName],manageIpsecVpn->[genIpsecVpnName,getXml,genIkeGatewayName,manageIpsecVpn],manageZoneInterface->[getXml,genZoneInterfaceName,manageZoneInterface],extractApplications->[getIcmpType,getIcmpCode,getProtocol],manageSecurityPolicy->[getDestNatRules,getStaticNatRules,genSecurityPolicyName,genAddressBookEntryName,manageAddressBookEntry,manageSecurityPolicy,getXml],addSecurityPolicyAndApplications->[manageApplication,genApplicationName,manageSecurityPolicy],manageFirewallFilter->[getSourceCidrs,getCountName,getPortRange,getName,genMultipleEntries,getIcmpType,getIcmpCode,getProtocol,getDestIp,genPortRangeEntry,manageFirewallFilter,getXml,genIcmpEntries],manageProxyArp->[getXml,manageProxyArp],getStaticNatRules->[getXml],getUsageAnswerKey->[getGuestVlanTag,getIpAddress],removeSecurityPolicyAndApplications->[manageApplication,getUnusedApplications,getApplicationsForSecurityPolicy,parseApplicationName,manageSecurityPolicy],openConfiguration->[getXml],manageUsageFilter->[getName,manageUsageFilter,getXml,getAddressType],genApplicationName->[getIdentifier],login->[getXml],getApplicationsForSecurityPolicy->[getXml],genMultipleEntries->[genNameValueEntry],updateUsageAnswer->[getBytesMap,updateBytesMap,getUsageFilter,getUsageAnswerKey],closeConfiguration->[getXml],sendUsageRequestAndCheckResponse->[sendUsageRequest,checkResponse],manageAccessProfile->[manageAccessProfile,getXml,genAccessProfileName],usageLogin->[getXml],removeEgressSecurityPolicyAndApplications->[manageApplication,getUnusedApplications,getApplicationsForSecurityPolicy,parseApplicationName,manageAddressBookEntry,manageSecurityPolicy]]]
Manages a security policy, substituting the action, zones, and rule values into the XML template and adding or deleting the policy as needed.
Why did you put it out of "if-then-else"? Shouldn't it be executed only if "!type.equals(SecurityPolicyType.SECURITYPOLICY_EGRESS_DEFAULT)"?
@@ -137,6 +137,10 @@ class SystemCollectionManager implements SystemCollectionManagerInterface return; } + if (!$token->getUser() instanceof UserInterface) { + return; + } + return $token->getUser()->getId(); }
[SystemCollectionManager->[iterateOverCollections->[iterateOverCollections]]]
Gets the id of the user attached to the security token.
Why not extracting `$token->getUser()` to `$user` beforehand?
@@ -65,8 +65,13 @@ public class LoginHelper { private static Logger log = Logger.getLogger(LoginHelper.class); private static final String DEFAULT_KERB_USER_PASSWORD = "0"; - private static final Long MIN_PG_DB_VERSION = 90600L; - private static final String MIN_PG_DB_VERSION_STRING = "9.6"; + private static final Long MIN_PG_DB_VERSION = 100001L; + private static final Long MAX_PG_DB_VERSION = 130000L; + private static final String MIN_PG_DB_VERSION_STRING = "10"; + private static final String MAX_PG_DB_VERSION_STRING = "12"; + private static final Double OS_VERSION_CHECK = 15.2; + private static final Long OS_VERSION_MIN_DB_VERSION = 120000L; + private static final String OS_VERSION_WANTED_DB_VERSION = "12"; public static final String DEFAULT_URL_BOUNCE = "/rhn/YourRhn.do"; /**
[LoginHelper->[checkExternalAuthentication->[getSgsFromExtGroups,getMessage,equals,CreateUserCommand,isDisabled,getRemoteUser,warn,lookupByLogin,setRawPassword,getRolesFromExtGroups,setFirstNames,isEmpty,setEmail,getExtGroups,getAttribute,error,getSatConfigLongValue,setLastName,updateUser,decodeFromIso88591,setOrg,getUser,storeNewUser,getName,setLogin,validate,setTemporaryRoles,rollbackTransaction,getSatConfigBooleanValue,UpdateUserCommand,lookupById,add,setServerGroups,lookupByName],getSgsFromExtGroups->[info,getServerGroups,addAll,lookupOrgExtGroupByLabelAndOrg],getRolesFromExtGroups->[info,lookupExtGroupByLabel,addAll,getRoles],loginUser->[loginUser,getMessage,add],updateUrlBounce->[trim,equals,endsWith,isBlank,startsWith],validateDBVersion->[replaceAll,split,getMessage,error,size,equals,debug,isPostgresql,get,getMode,getInstance,isUyuni,isDebugEnabled,add,readStringFromFile,valueOf,execute],isSchemaUpgradeRequired->[debug,size,equals,get,getMode,isDebugEnabled,getRpmSchemaVersion,execute],getRpmSchemaVersion->[replace,SystemCommandExecutor,execute],publishUpdateErrataCacheEvent->[UpdateErrataCacheEvent,debug,start,stop,getTime,isDebugEnabled,setOrgId,publish,StopWatch,getId],successfulLogin->[updateWebUserId,setLastLoggedIn,Date,getOrg,getId,publishUpdateErrataCacheEvent,storeUser],getExtGroups->[getAttribute,toArray,join,warn,add,parseLong],decodeFromIso88591->[getBytes,String,warn],getLogger]]
Login helper: checks whether the user is externally authenticated and validates the database version against the supported range.
Here it is 12, but the Long is 130000. Is this intentional?
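PostgreSQL 10+ encodes the server version as `major * 10000 + minor`, so `130000` decodes to 13.0 while the paired string constant says "12" — the mismatch the reviewer flags. A small sketch of that decoding (assuming the standard numbering scheme):

```python
def pg_version_string(version_num):
    """Decode a PostgreSQL >= 10 numeric version: major * 10000 + minor."""
    major, minor = divmod(version_num, 10000)
    return f"{major}.{minor}"
```

Under this scheme the patch's MAX_PG_DB_VERSION of 130000 corresponds to "13", not "12".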
@@ -310,6 +310,11 @@ public class RepositoryDescriptor { return fulltextDescriptor; } + @XNode("indexing/fulltext@fieldSizeLimit") + public void setFulltextFieldSizeLimit(int fieldSizeLimit) { + fulltextDescriptor.setFulltextFieldSizeLimit(fieldSizeLimit); + } + @XNode("indexing/fulltext@disabled") public void setFulltextDisabled(boolean disabled) { fulltextDescriptor.setFulltextDisabled(disabled);
[RepositoryDescriptor->[FieldDescriptor->[copyList->[FieldDescriptor],setXNodeContent->[setName]],setPool->[setName],getArrayColumns->[defaultFalse],setFulltextIndexes->[setFulltextIndexes],getProxiesEnabled->[defaultTrue],setFulltextParser->[setFulltextParser],merge->[merge],setFulltextDisabled->[setFulltextDisabled],getCachingMapperEnabled->[defaultTrue],getClusteringEnabled->[defaultFalse],setFulltextIncludedTypes->[setFulltextIncludedTypes],getPathOptimizationsEnabled->[defaultTrue],getSoftDeleteEnabled->[defaultFalse],setFulltextExcludedTypes->[setFulltextExcludedTypes],getNoDDL->[defaultFalse],getAclOptimizationsEnabled->[defaultTrue],setFulltextSearchDisabled->[setFulltextSearchDisabled],copyList]]
Adds a setter mapping the fulltext fieldSizeLimit XNode onto the fulltext descriptor.
Please add similar changes to `DBSRepositoryDescriptor` using `@XNode("fulltext@fieldSizeLimit")`.
@@ -173,6 +173,7 @@ public class DescribeConnectorExecutorTest { // Then: verify(engine).getMetaStore(); + verify(engine).metricCollectors(); verify(metaStore).getAllDataSources(); verify(connectClient).status("connector"); verify(connectClient).describe("connector");
[DescribeConnectorExecutorTest->[shouldNotWarnClientOnMissingTopicsEndpoint->[getTopics,status,getWarnings,assertThat,thenReturn,describe,failure,isPresent,get,topics,empty,getEntity,getMetaStore,instanceOf,getAllDataSources],teardown->[verifyNoMoreInteractions],shouldDescribeKnownConnector->[status,getTopic,assertThat,size,thenReturn,describe,isPresent,getName,get,topics,success,getEntity,getMetaStore,getConnectorClass,is,instanceOf,getStatus,getAllDataSources],shouldErrorIfConnectClientFailsDescribe->[status,thenReturn,describe,failure,isPresent,get,getEntity,is,instanceOf,assertThat,getErrorMessage],shouldWorkIfUnknownConnector->[status,describe,isPresent,get,empty,getEntity,getConnectorClass,is,getStatus,instanceOf,assertThat,DescribeConnectorExecutor,getSources],setUp->[nonWindowed,KsqlTopic,DescribeConnector,of,thenReturn,build,empty,KsqlConfig,success,DescribeConnectorExecutor,name],shouldDescribeKnownConnectorIfTopicListFails->[status,assertThat,size,thenReturn,describe,failure,isPresent,get,topics,getEntity,getConnectorClass,getMetaStore,is,instanceOf,getAllDataSources],shouldErrorIfConnectClientFailsStatus->[status,thenReturn,isPresent,failure,get,getEntity,is,instanceOf,assertThat,getErrorMessage],singletonList,of,singletonMap,ConnectorState,ConnectorStateInfo,ConnectorInfo,TaskState]]
Checks if there is a known connector with a specific ID.
it's a bit silly that we have to verify interactions we don't care about, but I've fought enough battles for today.
@@ -35,3 +35,13 @@ class PerlXmlParser(PerlPackage): version('2.44', 'af4813fe3952362451201ced6fbce379') depends_on('expat') + + def configure_args(self): + args = [] + + el = join_path(self.spec['expat'].prefix, 'lib') + ei = join_path(self.spec['expat'].prefix, 'include') + args.append('EXPATLIBPATH={p}'.format(p=el)) + args.append('EXPATINCPATH={p}'.format(p=ei)) + + return args
[PerlXmlParser->[depends_on,version]]
Builds the configure arguments pointing the build at expat's lib and include directories.
You can use `self.spec['expat'].prefix.lib` instead of `join_path(self.spec['expat'].prefix, 'lib')`. Same for include.
@@ -206,6 +206,7 @@ func (d *Dispatcher) GetDockerAPICommand(conf *config.VirtualContainerHostConfig dEnv = append(dEnv, fmt.Sprintf("DOCKER_CERT_PATH=%s", abs)) } } else { + log.Infof("") log.Warnf("Unable to find valid client certs") log.Warnf("DOCKER_CERT_PATH must be provided in environment or certificates specified individually via CLI arguments") }
[InspectVCH->[LoadCertificate,Join,Error,Infof,Begin,X509Certificate,Sprintf,VerifyClientCert,PowerState,Warnf,End,IsNil,String,Errorf,IsUnspecifiedIP,ShowVCH,NewKeyPair,Debugf],ShowVCH->[Infof,GetDockerAPICommand,String,Info,WriteFile],GetDockerAPICommand->[Join,Stat,IsDir,Sprintf,Name,Warnf,IsNil,Abs],Sprintf,Join,Infof,Current]
GetDockerAPICommand returns the docker command used to connect to the deployed endpoint.
Is this cosmetic?
@@ -41,6 +41,13 @@ module Engine end def check_track_restrictions!(entity, old_tile, new_tile) + # handle edge cases where the new ends override olds but original paths are not touched + raise GameError, 'New track must override old one' if INVALID_TRACK_UPDATES.any? do |props| + old_tile.name == props[:old] && + new_tile.name == props[:new] && + props[:diff].include?((new_tile.rotation - old_tile.rotation + 6) % 6) + end + super(entity.company? ? entity.owner : entity, old_tile, new_tile) end end
[update_tile_lists->[label,tiles,icons,preprinted,include?,location_name,tile_by_id,id,start_with?,delete],lay_tile->[label,icons,include?,location_name,tile,hex],check_track_restrictions!->[company?,owner],new]
Check if the track restriction is satisfied.
why are these edge cases?
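The `(new_tile.rotation - old_tile.rotation + 6) % 6` expression normalizes the rotation difference of a six-sided tile into the range 0..5. An isolated sketch of that arithmetic (hypothetical helper, not the engine's code):

```python
def rotation_diff(new_rotation, old_rotation, sides=6):
    """Normalized rotation difference for a tile with `sides` edges (0..sides-1)."""
    return (new_rotation - old_rotation + sides) % sides
```

Adding `sides` before the modulo keeps the result non-negative even in languages where `%` can return negative values; in Python it is redundant but harmless.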
@@ -120,7 +120,9 @@ public class OMBucketCreateRequest extends OMClientRequest { OMMetadataManager metadataManager = ozoneManager.getMetadataManager(); - BucketInfo bucketInfo = getBucketInfoFromRequest(); + CreateBucketRequest createBucketRequest = getOmRequest() + .getCreateBucketRequest(); + BucketInfo bucketInfo = createBucketRequest.getBucketInfo(); String volumeName = bucketInfo.getVolumeName(); String bucketName = bucketInfo.getBucketName();
[OMBucketCreateRequest->[getBeinfo->[getBeinfo,getKeyName,warmUpEncryptedKeys,build,getMetadata,getCipher,setSuite,convert,OMException],getBucketInfoFromRequest->[getBucketInfo,getCreateBucketRequest],addDefaultAcls->[toList,setAcls,getAcls,addAll,collect,inheritDefaultAcls],validateAndUpdateCache->[acquireWriteLock,acquireReadLock,OMBucketCreateResponse,getUserInfo,getFromProtobuf,getVolumeKey,setFlushFuture,incNumBucketCreates,getAclsEnabled,getVolumeName,getMetrics,getBucketInfoFromRequest,of,createErrorOMResponse,toAuditMap,getBucketName,getBucketKey,add,auditLog,incNumBucketCreateFails,debug,error,addDefaultAcls,setCreateBucketResponse,incNumBuckets,OMException,getAuditLogger,buildAuditMessage,build,addCacheEntry,releaseReadLock,get,checkAcls,getMetadataManager,setStatus,releaseWriteLock],preExecute->[getBeinfo,setCreationTime,hasBeinfo,build,getKmsProvider,toBuilder,getBucketInfo,getCreateBucketRequest,now,setBucketInfo,setBeinfo],getLogger]]
Validates the create-bucket request and updates the cache, failing when the bucket is already created.
Any reason for removing method, and adding that logic here?
@@ -55,6 +55,8 @@ cov = mne.read_cov(cov_fname) # Handling average file condition = 'Left visual' evoked = mne.read_evokeds(ave_fname, condition=condition, baseline=(None, 0)) +if all([p['active'] for p in evoked.info['projs']]): + evoked.proj = True evoked = mne.pick_channels_evoked(evoked) # We make the window slightly larger than what you'll eventually be interested # in ([-0.05, 0.3]) to avoid edge effects.
[set_data_time_index,pick_channels_evoked,read_evokeds,regularize,print,read_cov,read_forward_solution,data_path,dict,plot,apply_inverse,pick_types,crop,show_view,plot_sparse_source_estimates,add_label,make_inverse_operator,tf_mixed_norm]
Reads the noise covariance and the evoked response for one condition, then runs the time-frequency mixed-norm inverse solver.
Please fix this hack with a clean PR as suggested in the issue you opened. Then rebase and clean this up.
@@ -55,6 +55,8 @@ class Activation(Registrable): # Activation.by_name('relu')() Registrable._registry[Activation] = { "linear": lambda: lambda x: x, # type: ignore + "mish": lambda: lambda x: x * torch.tanh(torch.nn.functional.softplus(x)), + "swish": lambda: lambda x: x * torch.sigmoid(x), "relu": torch.nn.ReLU, "relu6": torch.nn.ReLU6, "elu": torch.nn.ELU,
[No CFG could be retrieved]
Registers the built-in activation functions, including mish and swish, by name.
You will need to add this `# type: ignore` statement to these lambda functions for our mypy check.
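The two registered activations can be written without torch for illustration — mish is `x * tanh(softplus(x))` and swish is `x * sigmoid(x)` (scalar sketch; the real registry entries operate on tensors):

```python
import math

def softplus(x):
    """softplus(x) = ln(1 + e^x); naive form, fine for small demo inputs."""
    return math.log1p(math.exp(x))

def mish(x):
    """mish(x) = x * tanh(softplus(x))"""
    return x * math.tanh(softplus(x))

def swish(x):
    """swish(x) = x * sigmoid(x)"""
    return x / (1.0 + math.exp(-x))
```

Both are zero at the origin and approach the identity for large positive inputs, which is why they behave like smooth variants of ReLU.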
@@ -8,6 +8,9 @@ from django.contrib.sites.models import Site from django.core.exceptions import ValidationError from django.db import transaction +from saleor.order.capture_payment import capture_payments +from saleor.order.interface import OrderPaymentAction + from ..account.models import User from ..core import analytics from ..core.exceptions import AllocationError, InsufficientStock, InsufficientStockData
[_move_fulfillment_lines_to_target_fulfillment->[_get_fulfillment_line],create_replace_order->[_populate_replace_order_fields],process_replace->[_move_lines_to_replace_fulfillment,create_replace_order],approve_fulfillment->[order_fulfilled],_get_fulfillment_line->[_get_fulfillment_line_if_exists],_process_refund->[_calculate_refund_amount],create_return_fulfillment->[_move_lines_to_return_fulfillment,order_returned],create_fulfillments->[_create_fulfillment_lines],create_refund_fulfillment->[__get_shipping_refund_amount,_move_order_lines_to_target_fulfillment,_move_fulfillment_lines_to_target_fulfillment],_move_lines_to_return_fulfillment->[_move_order_lines_to_target_fulfillment,_move_fulfillment_lines_to_target_fulfillment],order_fulfilled->[order_fulfilled],order_captured->[handle_fully_paid_order],order_created->[order_created],_create_fulfillment_lines->[fulfill_order_lines],_move_lines_to_replace_fulfillment->[_move_order_lines_to_target_fulfillment,_move_fulfillment_lines_to_target_fulfillment],automatically_fulfill_digital_lines->[fulfill_order_lines],create_fulfillments_for_returned_products->[__get_shipping_refund_amount,create_return_fulfillment,process_replace],order_confirmed->[order_confirmed]]
Adds imports for the payment-capture helpers used when processing order actions.
Absolute instead of relative
@@ -880,9 +880,10 @@ if (classes === null) { } // HINT: if we're going to support specific class definitions this process won't work anymore as it will override the definitions. -exportTypeScriptDeclarations (classes) +exportTypeScriptDeclarations (classes) // we use typescript? transpilePrecisionTests () +transpileDateTimeTests() transpilePythonAsyncToSync ('./python/test/test_async.py', './python/test/test.py') // transpilePrecisionTests ('./js/test/base/functions/test.number.js', './python/test/test_decimal_to_precision.py', './php/test/decimal_to_precision.php')
[transpileJavaScriptToPython3,bright,sep,filter,join,closeSync,includes,regexAll,map,transpilePython3ToPython2,mkdirSync,transpileJavaScriptToPHP,cyan,transpileDerivedExchangeFiles,argv,replace,log,concat,transpileJavaScriptToPythonAndPHP,openSync,code,red,require,Array,trim,replaceInFile,createPHPClass,indexOf,createFolderRecursively,exportTypeScriptDeclarations,createPythonClass,readFileSync,truncateSync,transpileDerivedExchangeClass,transpilePrecisionTests,match,length,push,transpilePythonAsyncToSync,magenta,split,createFolder,keys,writeFileSync,overwriteFile,unCamelCase,transpileDerivedExchangeFile,yellow,exec,deleteFunction,forEach,slice,readdirSync]
Exports the TypeScript declarations for the given classes and runs the transpile steps for the test suites.
> we use typescript? We do, however, our typescript support is currently very rudimentary. It was added for compatibility with the GDAX TT and for those who have requested it. We are going to enhance it further, and will probably make a transition from a loosely-typed to a strictly-typed language in order to actually support more languages, including Rust, Go, Typescript and maybe even C/C++.
@@ -325,6 +325,8 @@ func (c *Configure) Run(clic *cli.Context) (err error) { log.Errorf("Configuring cannot continue - failed to create validator: %s", err) return errors.New("configure failed") } + defer validator.Session.Logout(ctx) + _, err = validator.ValidateTarget(ctx, c.Data) if err != nil { log.Errorf("Configuring cannot continue - target validation failed: %s", err)
[Run->[processCertificates,processParams,copyChangedConf],Flags->[Flags]]
Run executes the configure workflow, validating the target and logging out of the validator session when finished.
There is one case where the session is created successfully but the NewValidator method still returns an error. Might also add a defer in that method.
@@ -90,7 +90,17 @@ func main() { if err != nil { log.Fatal("join meet error", zap.Error(err)) } - svr, err := server.CreateServer(cfg, api.NewHandler) + + coreAPIServiceBuilder := api.NewHandler + keyvisualServiceBuilder := keyvisual.NewKeyvisualService + + // Creates server. + ctx, cancel := context.WithCancel(context.Background()) + svr, err := server.CreateServer( + ctx, + cfg, + coreAPIServiceBuilder, + keyvisualServiceBuilder) if err != nil { log.Fatal("create server failed", zap.Error(err)) }
[ReplaceGlobals,InitLogger,Warn,Close,PrepareJoinCluster,NewConfig,CreateServer,Info,Exit,Done,SetupLogger,LogPanic,Error,Cause,Notify,PrintPDInfo,InitHTTPClient,EnableHandlingTimeHistogram,PrintConfigCheckMsg,Sync,Background,WithCancel,GetZapLogProperties,String,Parse,Fatal,GetZapLogger,LogPDInfo,Run,Push]
Initializes and runs the server, wiring the core API and keyvisual service builders into CreateServer.
why not use `api.NewHandler` and `keyvisual.NewKeyvisualService` directly?
@@ -104,8 +104,8 @@ function SidebarWithRef( // We are using refs here to refer to common strings, objects, and functions used. // This is because they are needed within `useEffect` calls below (but are not depended for triggering) // We use `useValueRef` for props that might change (user-controlled) + const sideRef = useValueRef(sideProp); const onBeforeOpenRef = useValueRef(onBeforeOpen); - const onAfterCloseRef = useValueRef(onAfterClose); const open = useCallback(() => { if (onBeforeOpenRef.current) {
[No CFG could be retrieved]
Creates the sidebar component, keeping refs to user-controlled props so effect callbacks read current values without re-triggering.
Hmm. I think here it actually should be a state? Does the component need to be rerendered when the side changes? It might need to be ignored for animations, but for rendering and component in general this could be a different story.
@@ -378,6 +378,14 @@ class Jetpack { */ public static $plugin_upgrade_lock_key = 'jetpack_upgrade_lock'; + /** + * The highest action hook priority that was used for the plugins_loaded event. + * Used in the `add_configure_hook' method. + * + * @var Integer the priority value. + */ + protected $highest_priority = null; + /** * Holds the singleton instance of this class *
[Jetpack->[reset_saved_auth_state->[reset_saved_auth_state],add_remote_request_handlers->[require_jetpack_authentication],admin_page_load->[disconnect],development_mode_trigger_text->[is_development_mode],is_active_and_not_development_mode->[is_development_mode],check_identity_crisis->[is_development_mode],register->[register],translate_role_to_cap->[translate_role_to_cap],activate_new_modules->[is_development_mode],activate_module->[is_development_mode],get_locale->[guess_locale_from_lang],authenticate_jetpack->[authenticate_jetpack],api_url->[api_url],is_development_mode->[is_development_mode],require_jetpack_authentication->[require_jetpack_authentication],admin_init->[is_development_mode],load_modules->[is_development_mode],get_secrets->[get_secrets],clean_nonces->[clean_nonces],sign_role->[sign_role],delete_secrets->[delete_secrets],generate_secrets->[generate_secrets],plugin_action_links->[is_development_mode],remove_non_jetpack_xmlrpc_methods->[remove_non_jetpack_xmlrpc_methods],xmlrpc_options->[xmlrpc_options],wp_rest_authenticate->[verify_xml_rpc_signature],implode_frontend_css->[is_development_mode],translate_current_user_to_role->[translate_current_user_to_role],build_connect_url->[build_connect_url],initialize_rest_api_registration_connector->[initialize_rest_api_registration_connector],verify_xml_rpc_signature->[verify_xml_rpc_signature],authorize_starting->[do_stats,stat],translate_user_to_role->[translate_user_to_role],alternate_xmlrpc->[alternate_xmlrpc],get_assumed_site_creation_date->[get_assumed_site_creation_date],dashboard_widget_footer->[is_development_mode],xmlrpc_api_url->[xmlrpc_api_url],public_xmlrpc_methods->[public_xmlrpc_methods],verify_json_api_authorization_request->[add_nonce],is_staging_site->[is_staging_site],show_development_mode_notice->[is_development_mode],jetpack_getOptions->[jetpack_getOptions],is_user_connected->[is_user_connected],add_nonce->[add_nonce],wp_dashboard_setup->[is_development_mode],xmlrpc_methods->[xmlrpc_method
s],setup_xmlrpc_handlers->[setup_xmlrpc_handlers]]]
Tracks the highest plugins_loaded hook priority used, alongside the existing plugin-upgrade lock key.
Just a nitpick, but what do you think of using a different variable name here? We also have the local `$highest_priority` variable in the `add_configure_hook()` method. The two variables have the same name but represent different things. Maybe something like `$configure_hook_priority`?
@@ -308,6 +308,11 @@ define([ gl.bindFramebuffer(gl.FRAMEBUFFER, null); }; + Framebuffer.prototype._attachTexture = function(context, attachment, texture) { + context.bindFramebuffer(this); + attachTexture(this, attachment, texture); + }; + Framebuffer.prototype._getActiveColorAttachments = function() { return this._activeColorAttachments; };
[No CFG could be retrieved]
Adds a private `_attachTexture` method to the Framebuffer prototype that binds the framebuffer and attaches the given texture to the given attachment point.
It would be cleaner just to call `_bind` here, but then you end up re-binding the FBO when you render the shadow casters.
@@ -407,8 +407,10 @@ type MasterConfig struct { // AuditConfig holds information related to auditing capabilities. AuditConfig AuditConfig - // EnableTemplateServiceBroker is a temporary switch which enables TemplateServiceBroker. - EnableTemplateServiceBroker bool + // TemplateServiceBrokerConfig holds information related to the template + // service broker. The broker is enabled if TemplateServiceBrokerConfig is + // non-nil. + TemplateServiceBrokerConfig *TemplateServiceBrokerConfig } // MasterAuthConfig configures authentication options in addition to the standard
[StringKeySet,NewString]
Replaces the temporary `EnableTemplateServiceBroker` boolean in MasterConfig with a `TemplateServiceBrokerConfig` pointer; the broker is enabled when the config is non-nil.
one of our option questions here was whether this should be nested into a generic "ServerBrokerConfigs" container struct so we can have multiple broker configuration objects in the future (for different brokers). The current thinking is we're more likely to move this config *out* of the master, than introduce new brokers into the master, but I'd like @smarterclayton to confirm that view before we paint ourselves into a corner.
@@ -14,7 +14,11 @@ module Redcarpet @options[:link_attributes]&.each do |attribute, value| link_attributes += %( #{attribute}="#{value}") end - %(<a href="#{link}"#{link_attributes}>#{content}</a>) + if (/^(http(s*):\/\/)/m.match? link) || link.nil? + %(<a href="#{link}"#{link_attributes}>#{content}</a>) + else + %(<a href="//#{link}"#{link_attributes}>#{content}</a>) + end end def header(title, header_number)
[HTMLRouge->[link->[each,match?],slugify->[gsub,sanitize],header->[slugify],include],require]
Renders an anchor tag for the given link and content, prefixing `//` when the link does not start with an http(s) scheme.
Can I ask why this regexp matches multiple `s` chars and uses `m` (dot matching new lines) as a modifier? I also suggest replacing `^` with `\A`
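The reviewer's two points — prefer `s?` over `s*`, and anchor with `\A` rather than `^` — can be sketched in Python, whose `re` module draws the same distinction (an illustrative sketch, not the project's Ruby code; Ruby's `^` is always line-anchored, which Python only does under `MULTILINE`):

```python
import re

# `s?` matches zero or one "s"; `s*` would also accept "httpss://".
scheme = re.compile(r"\A(https?://)")

assert scheme.match("https://example.com")
assert scheme.match("http://example.com")
assert not scheme.match("ftp://example.com")

# With MULTILINE, `^` anchors at every line start, so a scheme on an
# embedded second line would incorrectly match; `\A` only anchors at
# the start of the whole string.
multiline = re.compile(r"^(https?://)", re.MULTILINE)
assert multiline.search("evil\nhttp://example.com")
assert not re.search(r"\A(https?://)", "evil\nhttp://example.com")
```

The `m` modifier in the original Ruby (dot matches newline) has no effect on this pattern at all, since it contains no `.` metacharacter.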
@@ -444,8 +444,13 @@ describe InfoRequest do @ir = info_requests(:naughty_chicken_request) end + after do + Object.send(:remove_const, 'InfoRequest') + load 'app/models/info_request.rb' + end + it "rejects invalid states" do - lambda {@ir.set_described_state("foo")}.should raise_error(ActiveRecord::RecordInvalid) + expect {@ir.set_described_state("foo")}.to raise_error(ActiveRecord::RecordInvalid) end it "accepts core states" do
[apply_filters->[request_list,map],create,let,tag_string,it,to,calculated_state,find_by_incoming_email,with,public_body,info_requests_not_held_count,each,match,set_described_state,context,reload,and_return,date_deadline_extended,public_bodies,parse_raw_email!,create!,should,utc,root,save!,apply_censor_rules_to_binary!,info_request_events,magic_email_for_id,destroy,last,is_batch_request_template,created_at,request_email,title,event_type,id,url_name,find_old_unclassified,user_id,valid?,save,expand_path,all,find,new,recent_requests,similar_requests,before,apply_filters,magic_email,mock_model,now,email_subject_followup,gsub,get_random_old_unclassified,count_old_unclassified,awaiting_description,twice,gsub!,response_event,build,eq,apply_censor_rules_to_text!,log_event,idhash,raise_error,update_attribute,days,prominence,describe,mock,update_attributes,incoming_email,join,delete_all,first,info_requests_successful_count,described_state,name,move_to_public_body,should_receive,should_not,require,guess_by_incoming_email,count,class_eval,dirname,send,fully_destroy,info_requests,incoming_messages,and_call_original,users]
Specs for InfoRequest state handling: rejects invalid described states, accepts core states, and covers overdue calculations (not overdue on the due date, overdue after 20 working days, not overdue when the deadline has been extended).
Prefer double-quoted strings unless you need single quotes to avoid extra backslashes for escaping.
@@ -198,10 +198,10 @@ function record_sensor_data($device, $all_sensors) // FIXME also warn when crossing WARN level! if ($sensor['sensor_limit_low'] != '' && $prev_sensor_value > $sensor['sensor_limit_low'] && $sensor_value < $sensor['sensor_limit_low'] && $sensor['sensor_alert'] == 1) { echo 'Alerting for '.$device['hostname'].' '.$sensor['sensor_descr']."\n"; - log_event("$class {$sensor['sensor_descr']} under threshold: $sensor_value $unit (< {$sensor['sensor_limit_low']} $unit)", $device, $class, 4, $sensor['sensor_id']); + log_event("$class under threshold: $sensor_value $unit (< {$sensor['sensor_limit_low']} $unit)", $device, $class, 4, $sensor['sensor_id']); } elseif ($sensor['sensor_limit'] != '' && $prev_sensor_value < $sensor['sensor_limit'] && $sensor_value > $sensor['sensor_limit'] && $sensor['sensor_alert'] == 1) { echo 'Alerting for '.$device['hostname'].' '.$sensor['sensor_descr']."\n"; - log_event("$class {$sensor['sensor_descr']} above threshold: $sensor_value $unit (> {$sensor['sensor_limit']} $unit)", $device, $class, 4, $sensor['sensor_id']); + log_event("$class above threshold: $sensor_value $unit (> {$sensor['sensor_limit']} $unit)", $device, $class, 4, $sensor['sensor_id']); } if ($sensor['sensor_class'] == 'state' && $prev_sensor_value != $sensor_value) { $trans = array_column(
[record_sensor_data->[addDataset],poll_device->[getAttribs,getMessage,addDataset,getTraceAsString],poll_mib_def->[addDataset]]
Records sensor data for a device, logging an event when a value crosses its configured low or high alert threshold, and tracking changes for state-class sensors.
Why did you remove this?
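The crossing check in the patch — alert only when the reading moves from inside to outside a limit on this poll, not while it stays out of range — can be sketched language-neutrally in Python (function names are illustrative, not LibreNMS's API):

```python
def crossed_low(prev, value, low):
    """True only when the reading falls below the low limit on this poll."""
    return low is not None and prev > low > value

def crossed_high(prev, value, high):
    """True only when the reading rises above the high limit on this poll."""
    return high is not None and prev < high < value

assert crossed_low(prev=5.0, value=2.0, low=3.0)      # fell below this poll
assert not crossed_low(prev=2.0, value=1.0, low=3.0)  # was already below
assert crossed_high(prev=5.0, value=9.0, high=8.0)    # rose above this poll
```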
@@ -94,6 +94,11 @@ func (proc *Process) getDetails(envPredicate func(string) bool) error { if err := proc.Cpu.Get(proc.Pid); err != nil { return fmt.Errorf("error getting process cpu time for pid=%d: %v", proc.Pid, err) } + + proc.IO = sigar.ProcIO{} + if err := proc.IO.Get(proc.Pid); err != nil { + return fmt.Errorf("error getting process io for pid=%d: %v", proc.Pid, err) + } if proc.CmdLine == "" { args := sigar.ProcArgs{}
[GetProcStats->[getDetails,getProcessEvent,MatchProcess]]
getDetails populates the process's CPU, IO, and command-line details for the given PID, returning an error if any lookup fails.
This statement looks to be unnecessary.
@@ -83,5 +83,10 @@ namespace NServiceBus.Settings /// Setup property injection for the given type based on convention. /// </summary> void ApplyTo<T>(IComponentConfig config); + + /// <summary> + /// Setup property injection for the given type based on convention. + /// </summary> + void ApplyTo(Type componentType, IComponentConfig config); } }
[No CFG could be retrieved]
Adds a non-generic `ApplyTo(Type, IComponentConfig)` overload that sets up convention-based property injection for the given component type.
Why are we adding new stuff to support property injection? I thought we were moving away from that?
@@ -12,14 +12,14 @@ namespace System.Xml.Linq public class XDocumentType : XNode { private string _name; - private string _publicId; - private string _systemId; + private string? _publicId; + private string? _systemId; private string _internalSubset; /// <summary> /// Initializes an empty instance of the <see cref="XDocumentType"/> class. /// </summary> - public XDocumentType(string name, string publicId, string systemId, string internalSubset) + public XDocumentType(string name, string? publicId, string? systemId, string internalSubset) { _name = XmlConvert.VerifyName(name); _publicId = publicId;
[XDocumentType->[GetDeepHashCode->[GetHashCode],Task->[IsCancellationRequested,WriteDocTypeAsync,FromCanceled,nameof],DeepEquals->[_publicId,_name,SystemId,_internalSubset],WriteTo->[WriteDocType,nameof],Value,nameof,NotifyChanged,_name,VerifyName,GetAttribute,Name,_publicId,_internalSubset,DocumentType,Read,NotifyChanging,_systemId]]
Annotates XDocumentType's `publicId` and `systemId` fields and constructor parameters as nullable (`string?`).
most likely `internalSubset` should be nullable (looking at other classes and no guarding against null)
@@ -45,7 +45,7 @@ namespace System.Collections.Generic internal interface IArraySortHelper<TKey, TValue> { - void Sort(Span<TKey> keys, Span<TValue> values, IComparer<TKey>? comparer); + void Sort<TComparer>(Span<TKey> keys, Span<TValue> values, TComparer comparer) where TComparer : IComparer<TKey>?; } [TypeDependency("System.Collections.Generic.GenericArraySortHelper`2")]
[ArraySortHelper->[CreateArraySortHelper->[Allocate,Instantiate,IsAssignableFrom],CreateArraySortHelper]]
Changes `IArraySortHelper<TKey, TValue>.Sort` to take the comparer as a generic type parameter (`TComparer : IComparer<TKey>?`) instead of an `IComparer<TKey>?` argument.
I think it would be better for TComparer to be on IArraySortHelper. Generic virtual methods are full of (performance) problems.
@@ -67,6 +67,9 @@ import <%= packageName %>.AbstractNeo4jIT; <%_ if (cacheProvider === 'redis') { _%> import <%= packageName %>.RedisTestContainerExtension; <%_ } _%> +<%_ if (reactive && ['mysql', 'postgresql', 'mssql'].includes(prodDatabaseType)) { _%> +import <%= packageName %>.ReactiveSqlTestContainerExtension; +<%_ } _%> import <%= packageName %>.<%= mainClass %>; <%_ if (authenticationType === 'uaa') { _%> import <%= packageName %>.config.SecurityBeanOverrideConfiguration;
[No CFG could be retrieved]
Template change: additionally imports `ReactiveSqlTestContainerExtension` into the generated test class when the application is reactive and uses MySQL, PostgreSQL, or MSSQL.
I thought mariadb was a supported database for R2DBC. Am I missing something?
@@ -14,8 +14,6 @@ type kafkaConfig struct { TLS *outputs.TLSConfig `config:"tls"` Timeout time.Duration `config:"timeout" validate:"min=1"` Worker int `config:"worker" validate:"min=1"` - UseType bool `config:"use_type"` - Topic string `config:"topic"` KeepAlive time.Duration `config:"keep_alive" validate:"min=0"` MaxMessageBytes *int `config:"max_message_bytes" validate:"min=1"` RequiredACKs *int `config:"required_acks" validate:"min=-1"`
[Validate->[ToLower,New,Errorf]]
Removes the `UseType` and `Topic` fields from the Kafka output configuration struct.
Topic still exist, don't we need it here?
@@ -40,9 +40,9 @@ public final class MockOnlyOneTicketRegistry implements TicketRegistry { } @Override - public boolean deleteTicket(final String ticketId) { + public int deleteTicket(final String ticketId) { this.ticket = null; - return false; + return 0; } @Override
[MockOnlyOneTicketRegistry->[updateTicket->[addTicket]]]
Changes `deleteTicket` to return the number of deleted tickets (`int`) instead of a boolean; the mock clears its single ticket and returns 0.
I guess it's not a big deal with a mocked ticket registry. But shouldn't it return 1?
@@ -7,6 +7,17 @@ class ArticlesController < ApplicationController before_action :set_cache_control_headers, only: %i[feed] after_action :verify_authorized + FEED_ALLOWED_TAGS = %w[ + strong em a table tbody thead tfoot th tr td col colgroup del p h1 h2 h3 h4 + h5 h6 blockquote iframe time div span i em u b ul ol li dd dl dt q code pre + img sup cite center br small + ].freeze + + FEED_ALLOWED_ATTRIBUTES = %w[ + href strong em class ref rel src title alt colspan height width size rowspan + span value start data-conversation data-lang id + ].freeze + def feed skip_authorization
[ArticlesController->[preview->[new],update->[update]]]
Adds frozen `FEED_ALLOWED_TAGS` and `FEED_ALLOWED_ATTRIBUTES` allow-lists to ArticlesController for sanitizing HTML in the feed.
these were buried in the RSS builder
@@ -129,6 +129,12 @@ class DictionaryAgent(Agent): dictionary.add_argument( '--dict-lower', default=DictionaryAgent.default_lower, type='bool', help='Whether or not to lowercase all text seen.') + dictionary.add_argument( + '--bpe-num-symbols', default=30000, type=int, + help='Number of BPE symbols. Recommended between 30000 and 40000') + dictionary.add_argument( + '--bpe-codecs-file', + help='Filename for the BPE codecs. Defaults to dictfile.codecs.') except argparse.ArgumentError: # already added pass
[find_ngrams->[find_ngrams],DictionaryAgent->[keys->[keys],load->[unescape],tokenize->[find_ngrams],shutdown->[save],save->[escape],txt2vec->[tokenize],max_freq->[keys],act->[add_to_dict,tokenize],span_tokenize->[spacy_span_tokenize],sort->[resize_to_max,remove_tail]]]
Adds `--bpe-num-symbols` and `--bpe-codecs-file` command-line arguments to DictionaryAgent's dictionary argument group.
how do you turn off bpe?
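The behaviour of the two new flags can be sketched with plain `argparse` (stand-alone; the real code adds them to ParlAI's dictionary argument group, and note argparse turns the dashes into underscores on the parsed namespace):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--bpe-num-symbols', default=30000, type=int,
                    help='Number of BPE symbols. Recommended between 30000 and 40000')
parser.add_argument('--bpe-codecs-file',
                    help='Filename for the BPE codecs. Defaults to dictfile.codecs.')

# Defaults apply when neither flag is passed.
opt = parser.parse_args([])
assert opt.bpe_num_symbols == 30000
assert opt.bpe_codecs_file is None

# Overriding the symbol count on the command line.
opt = parser.parse_args(['--bpe-num-symbols', '40000'])
assert opt.bpe_num_symbols == 40000
```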
@@ -104,6 +104,7 @@ public final class InternalHiveConnectorFactory binder.bind(PageSorter.class).toInstance(context.getPageSorter()); binder.bind(CatalogName.class).toInstance(new CatalogName(catalogName)); }, + binder -> newSetBinder(binder, EventListener.class), module); Injector injector = app
[InternalHiveConnectorFactory->[createConnector->[createConnector]]]
Creates the Hive connector, additionally registering an (empty) multibinder set for `EventListener` in the injector modules.
Agreed, we should just bind an empty set here.
@@ -12,10 +12,10 @@ locals: <%= tag.div class: classes do %> <div class="spinner-button__content"> <%= content %> - <span class="spinner-button__spinner" aria-hidden="true"> - <span class="spinner-button__spinner-dot"></span> - <span class="spinner-button__spinner-dot"></span> - <span class="spinner-button__spinner-dot"></span> + <span class="spinner-dots spinner-dots--centered text-white" aria-hidden="true"> + <span class="spinner-dots__dot"></span> + <span class="spinner-dots__dot"></span> + <span class="spinner-dots__dot"></span> </span> </div> <% if local_assigns[:action_message] %>
[No CFG could be retrieved]
Spinner button partial: replaces the component-local spinner markup with the shared `spinner-dots` classes.
Ideally this should be its own component, like a ViewComponent counterpart to the React component introduced in the branch. Since this is already growing quite large, might save that for a follow-up.
@@ -224,7 +224,7 @@ module.exports = class btctradeua extends Exchange { const year = parts[2]; let hms = parts[4]; const hmsLength = hms.length; - if (hmsLength === 7) { + if ((hmsLength === 7) || (hmsLength === 4)) { hms = '0' + hms; } if (day.length === 1) {
[No CFG could be retrieved]
Parses an exchange timestamp, zero-padding the time component when it is 7 or 4 characters long (i.e. a single-digit hour).
Unnecessary declaration of `hms.length`, use directly.
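The padding rule from the patch, sketched in Python for clarity (the JS original pads when the time string is 7 or 4 characters long, i.e. a single-digit hour with or without seconds):

```python
def pad_hms(hms: str) -> str:
    # "9:30:15" (7 chars) or "9:30" (4 chars) -> single-digit hour,
    # so prepend a "0" to normalize the width.
    if len(hms) in (7, 4):
        hms = '0' + hms
    return hms

assert pad_hms('9:30:15') == '09:30:15'
assert pad_hms('9:30') == '09:30'
assert pad_hms('19:30:15') == '19:30:15'  # already two-digit hour
```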
@@ -55,7 +55,8 @@ PLATFORM_NAMES_TO_CONSTANTS = { } -version_re = re.compile(r"""(?P<major>\d+) # major (x in x.y) +version_re = re.compile( + r"""(?P<major>\d+) # major (x in x.y) \.(?P<minor1>\d+) # minor1 (y in x.y) \.?(?P<minor2>\d+|\*)? # minor2 (z in x.y.z) \.?(?P<minor3>\d+|\*)? # minor3 (w in x.y.z.w)
[user_media_path->[getattr,format,upper,join],log_exception->[getLogger,exc_info,exception],get_cdn_url->[user_media_url,str,format,urlencode,join],user_media_url->[getattr,format,upper],getconn->[connect],log_configure->[dictConfig],QueuePool,values,compile]
Reformats the `version_re` compilation so the verbose pattern string starts on its own line; the regex parses version strings of the form x.y[.z[.w]].
nit: the formatting for this is weird now (not that it matters after it's compiled)
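The recompiled pattern can be exercised in isolation; here is a sketch of the same verbose regex parsing x.y[.z[.w]] version strings (the full original in the project likely handles more suffix groups, omitted here):

```python
import re

version_re = re.compile(
    r"""(?P<major>\d+)        # major (x in x.y)
    \.(?P<minor1>\d+)         # minor1 (y in x.y)
    \.?(?P<minor2>\d+|\*)?    # minor2 (z in x.y.z)
    \.?(?P<minor3>\d+|\*)?    # minor3 (w in x.y.z.w)
    """,
    re.VERBOSE,  # whitespace and # comments in the pattern are ignored
)

m = version_re.match("1.2.3.4")
assert m.group("major") == "1"
assert m.group("minor2") == "3"
assert version_re.match("1.2.*").group("minor2") == "*"
```

As the reviewer notes, the indentation inside the triple-quoted pattern is cosmetic: under `re.VERBOSE` the whitespace is stripped before compilation.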