Columns and string-length ranges:
patch: 18–160k
callgraph: 4–179k
summary: 4–947
msg: 6–3.42k
@@ -53,7 +53,8 @@ class Openssl(Package):
     version('1.0.1h', '8d6d684a9430d5cc98a62a5d8fbda8cf')

     depends_on("zlib")
-    # Also requires make and perl
+    depends_on("perl", type='build')
+    # Also requires make

     parallel = False
[Openssl->[install->[join_path,satisfies,append,Executable,pop,filter_file,config,make],handle_fetch_error->[warn],depends_on,version]]
Called when a fetch fails.
I think we can get rid of this "Also requires make" message. So do most of the packages in Spack.
@@ -303,12 +303,12 @@ func (h *balanceHotRegionsScheduler) balanceByPeer(cluster opt.Cluster, storesSt
			continue
		}
-		if isRegionUnhealthy(srcRegion) {
+		if !opt.IsRegionHealthyAllowPending(cluster, srcRegion) {
			schedulerCounter.WithLabelValues(h.GetName(), "unhealthy-replica").Inc()
			continue
		}
-		if len(srcRegion.GetPeers()) != cluster.GetMaxReplicas() {
+		if !opt.IsRegionReplicated(cluster, srcRegion) {
			log.Debug("region has abnormal replica count", zap.String("scheduler", h.GetName()), zap.Uint64("region-id", srcRegion.GetID()))
			schedulerCounter.WithLabelValues(h.GetName(), "abnormal-replica").Inc()
			continue
[balanceByLeader->[allowBalanceLeader,GetName],Schedule->[GetName],balanceByPeer->[allowBalanceRegion,GetName],balanceHotReadRegions->[GetName],balanceHotWriteRegions->[GetName]]
balanceByPeer attempts to balance a region by moving a peer. ExcludedFilter returns the source and destination stores that should be excluded from selection.
it is too long
@@ -244,11 +244,11 @@ def _template_read_video(video_object, s=0, e=None):
     video_frames = torch.empty(0)
     frames = []
     video_pts = []
-    for frame in itertools.takewhile(lambda x: x['pts'] <= e, video_object):
-        if frame['pts'] < s:
+    for frame in itertools.takewhile(lambda x: x["pts"] <= e, video_object):
+        if frame["pts"] < s:
             continue
-        frames.append(frame['data'])
-        video_pts.append(frame['pts'])
+        frames.append(frame["data"])
+        video_pts.append(frame["pts"])
     if len(frames) > 0:
         video_frames = torch.stack(frames, 0)
[TestVideo->[test_video_reading_fn->[_template_read_video,_decode_frames_by_av_module]],_decode_frames_by_av_module->[_read_from_stream,_fraction_to_tensor]]
Read video and audio frames from template.
Why these `' -> "` changes?
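To the reviewer's question: single- and double-quoted string literals are semantically identical in Python, so the change is purely stylistic (likely an automated formatter's preference; which formatter is an assumption, not stated in the patch). A minimal sketch:

```python
# Quote style does not change the value: both literals name the same key.
frame = {"pts": 3, "data": b"\x00"}
assert frame['pts'] == frame["pts"]
pts = frame["pts"]
```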
@@ -886,7 +886,8 @@ bool MapgenBasic::generateCavernsNoise(s16 max_stone_y)

 void MapgenBasic::generateDungeons(s16 max_stone_y)
 {
-	if (max_stone_y < node_min.Y)
+	if (node_min.Y > max_stone_y || node_min.Y > dungeon_ymax ||
+			node_max.Y < dungeon_ymin)
 		return;

 	u16 num_dungeons = std::fmax(std::floor(
[readParams->[readParams],getSpawnRangeMax->[calcMapgenEdges],spreadLight->[lightSpread],updateHeightmap->[findGroundLevel],writeParams->[writeParams]]
Generate dungeons for a given max_stone_y at the midpoint of the map.
Order of first condition swapped for clarity and consistency with other mapgen functions.
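The patched early-return is a range-intersection test. A sketch in Python of the same predicate (variable names are taken from the C++ and are otherwise assumptions):

```python
def dungeon_span_intersects(node_min_y, node_max_y, max_stone_y,
                            dungeon_ymin, dungeon_ymax):
    # Mirrors the patched guard: generation proceeds only when the mapchunk's
    # Y span reaches stone and overlaps the allowed [dungeon_ymin, dungeon_ymax] band.
    return not (node_min_y > max_stone_y
                or node_min_y > dungeon_ymax
                or node_max_y < dungeon_ymin)
```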
@@ -895,11 +895,9 @@ func (h *Helper) updateConfig(configDir string, opt *StartOptions) error {
			return err
		}
		glog.V(5).Infof("cgroup driver from Docker: %s", cgroupDriver)
-		if nodeCfg.KubeletArguments == nil {
-			nodeCfg.KubeletArguments = configapi.ExtendedArguments{}
-		}
		nodeCfg.KubeletArguments["cgroup-driver"] = []string{cgroupDriver}
	}
+	nodeCfg.KubeletArguments["fail-swap-on"] = []string{"false"}
	cfgBytes, err = configapilatest.WriteYAML(nodeCfg)
	if err != nil {
[updateConfig->[ServerVersion,GetConfigFromLocalDir,GetNodeConfigFromLocalDir],Start->[Start],StartNode->[Start]]
updateConfig updates the admission config with options. This function is called to initialize the config object. aggregator-front-proxy is a command that creates the necessary files for the aggregator. This function uploads the configuration file to the container and then creates the necessary files for the service. UploadFileToContainer uploads a config file to the container.
this appears to have broken backwards compatibility for cluster up.
@@ -9,12 +9,16 @@ from itertools import cycle
 import copy
 import numpy as np
 from scipy import linalg
+import mne

 # XXX : don't import pylab here or you will break the doc

 from .fiff.pick import channel_type, pick_types
 from .fiff.proj import make_projector, activate_proj

+COLORS = ['b', 'g', 'r', 'c', 'm', 'y', 'k', '#473C8B', '#458B74',
+          '#CD7F32', '#FF4040', '#ADFF2F', '#8E2323', '#FF1493']
+

 def plot_topo(evoked, layout):
     """Plot 2D topographies
[plot_cov->[tight_layout,sqrt,enumerate,figure,imshow,show,svd,make_projector,ylabel,dot,len,semilogy,subplot,title,subplots_adjust,xlabel,index,deepcopy,pick_types],plot_topo->[figure,plot,yticks,xticks,axes,index],plot_sparse_source_estimates->[sum,enumerate,figure,quiver3d,show,to_rgb,cycle,range,ylabel,clf,stcs,plot,unique,len,isinstance,concatenate,title,ColorConverter,next,xlabel,triangular_mesh],plot_source_estimate->[SurfaceViewer->[update_plot->[user_defined,astype,int,triangular_mesh_source,text,SmoothPolyDataFilter,len,surface,set,colorbar],__init__->[searchsorted,super,where],Range,SceneEditor,Item,on_trait_change,Group,Instance,View],SurfaceViewer,configure_traits],plot_evoked->[tight_layout,channel_type,show,ylim,activate_proj,make_projector,range,ylabel,clf,dot,plot,append,len,zip,title,subplot,subplots_adjust,xlabel,xlim]]
Plot 2D topographies.
don't use absolute import: `from .baseline import rescale`
@@ -1561,13 +1561,9 @@ void GenericCAO::processMessage(const std::string &data)
		}
	} else if (cmd == GENERIC_CMD_SPAWN_INFANT) {
		u16 child_id = readU16(is);
-		u8 type = readU8(is);
+		//u8 type = readU8(is); maybe this will be useful later
-		if (GenericCAO *childobj = m_env->getGenericCAO(child_id)) {
-			childobj->processInitData(deSerializeLongString(is));
-		} else {
-			m_env->addActiveObject(child_id, type, deSerializeLongString(is));
-		}
+		addAttachmentChild(child_id);
	} else {
		warningstream << FUNCTION_NAME << ": unknown command or outdated client \""
[step->[getSceneNode,update,removeFromScene,translate,addToScene,setAttachments,updateNodePos,getParent,getPosition],updateLight->[getParent],processInitData->[updateNodePos,processMessage,init], ClientActiveObject->[getType],processMessage->[update,processInitData,init,updateTexturePos,updateTextures,updateAnimation,updateNodePos,updateAttachments,getParent,updateBonePosition,updateAnimationSpeed],addToScene->[updateNodePos,getSceneNode],updateNodePos->[getSceneNode,getParent], removeFromScene->[removeFromScene],updateAttachments->[getAnimatedMeshSceneNode,getSceneNode,getParent], ClientActiveObject->[getType],directReportPunch->[updateTextures]]
Process a message received from the server. Reads a node's attributes, the frame data, and the data for a specific animation from the input stream (V3-level API). This function is called when a command is received; reads a child object from the input stream.
since this <i>does</i> exist in the cmd, I'd prefer to read it and use `(void)` to silence the unused warning
@@ -226,7 +226,7 @@ public class DeleteActionsBean implements DeleteActions, Serializable {
        if (docs == null) {
            return null;
        }
-       TrashInfo info = getTrashService().getTrashInfo(docs, currentUser, false, false);
+       TrashInfo info = (TrashInfo) getTrashService().getTrashInfo(docs, currentUser, false, false);
        DocumentModel targetContext = getTrashService().getAboveDocument(navigationContext.getCurrentDocument(),
                info.rootPaths);
[DeleteActionsBean->[checkDeletePermOnParents->[checkDeletePermOnParents],restoreCurrentDocument->[undeleteSelection],purgeSelection->[purgeSelection],deleteSelection->[deleteSelection],restoreActionDisplay->[isTrashManagementEnabled,getCanRestoreCurrentDoc],deleteSelectionSections->[deleteSelection],isTrashManagementEnabled->[isTrashManagementEnabled],getCanDelete->[getCanDelete],undeleteSelection->[undeleteSelection]]]
This method is called when the user has selected a document in the list. Proxies can not be deleted.
We should use new `TrashInfo`.
@@ -145,8 +145,6 @@ type Config struct {
	Dashboard DashboardConfig `toml:"dashboard" json:"dashboard"`

	ReplicationMode ReplicationModeConfig `toml:"replication-mode" json:"replication-mode"`
-
-	// EnableRedactLog indicates that whether redact log, 0 is disable. 1 is enable.
-	EnableRedactLog bool `toml:"enable-redact-log" json:"enable-redact-log"`
 }

 // NewConfig creates a new config.
[MigrateDeprecatedFlags->[migrateConfigurationMap],Parse->[Parse],RewriteFile->[GetConfigFile],IsDefined->[IsDefined],Adjust->[CheckUndecoded,Parse,IsDefined,Validate,Child],parseDeprecatedFlag->[IsDefined],adjustLog->[IsDefined],adjust->[Child,Validate,adjust,IsDefined],Parse]
NewConfig creates a new configuration object for the node label property. Adds command-line options for peer traffic forwarding.
Is this configuration compatible with the previous one?
@@ -4016,6 +4016,18 @@ void ByteCodeGenerator::StartEmitFunction(ParseNode *pnodeFnc)
                 this->EnsureLetConstScopeSlots(pnodeFnc->sxFnc.pnodeBodyScope, funcInfo);
             }
         }
+
+        if (!paramScope->GetCanMergeWithBodyScope() && bodyScope->GetScopeSlotCount() == 0 && !bodyScope->GetHasOwnLocalInClosure())
+        {
+            // When we have split scope the body scope may be wrongly marked as must instantiate even though the capture occurred
+            // in param scope. This check is to make sure if no capture occurs in body scope make in not must instantiate.
+            bodyScope->SetMustInstantiate(false);
+        }
+        else
+        {
+            bodyScope->SetMustInstantiate(funcInfo->frameObjRegister != Js::Constants::NoRegister || funcInfo->frameSlotsRegister != Js::Constants::NoRegister);
+        }
+
         paramScope->SetMustInstantiate(!paramScope->GetCanMergeWithBodyScope());
     }
     else
     {
[No CFG could be retrieved]
Private helper functions. Checks if a symbol is in module scope or not.
So what happens in the case of eval? You could split scope and there are no variables in the body scope. Also, why is there a different condition to change the mustInstantiate flag; why not do it on line 3807?
@@ -446,7 +446,7 @@ def examples_wordcount_minimal(renames):
      # [END examples_wordcount_minimal_count]

      # [START examples_wordcount_minimal_map]
-     | beam.Map(lambda word_count: '%s: %s' % (word_count[0], word_count[1]))
+     | beam.MapTuple(lambda word, count: '%s: %s' % (word, count))
      # [END examples_wordcount_minimal_map]

      # [START examples_wordcount_minimal_write]
[examples_wordcount_debugging->[FilterTextFn,RenameFiles],pipeline_monitoring->[CountWords->[expand->[FormatCountsFn,ExtractWordsFn]],CountWords,RenameFiles],examples_ptransforms_templated->[MySumFn,RenameFiles],model_textio_compressed->[RenameFiles],accessing_valueprovider_info_after_run->[LogValueProvidersFn],examples_wordcount_minimal->[RenameFiles],construct_pipeline->[ReverseWords,RenameFiles],pipeline_logging->[ExtractWordsFn],examples_wordcount_wordcount->[FormatAsTextFn,CountWords,RenameFiles],examples_wordcount_templated->[RenameFiles],model_custom_source->[ReadFromCountingSource,CountingSource],model_composite_transform_example->[CountWords],model_textio->[RenameFiles],ReadFromCountingSource->[expand->[_CountingSource]],model_custom_sink->[SimpleKVSink,WriteToKVSink],model_multiple_pcollections_partition->[partition_fn->[get_percentile]]]
Examples. This example shows how to count words in a King Lear text file.
I saw such pattern in quite a few places so this may be a helpful syntactic sugar, as long as it does not make things difficult for us. I wouldn't rush this PR to 2.15.0 if we need more time to polish things.
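The `beam.Map` → `beam.MapTuple` change can be illustrated without Beam. A plain-Python sketch of the difference (list comprehensions stand in for the pipeline; the data is invented): `Map` hands each element to the callable as one argument, while `MapTuple` unpacks a tuple element into positional arguments.

```python
counts = [("cat", 3), ("dog", 1)]

def map_style(word_count):
    # beam.Map style: the element arrives as a single tuple, indexed manually.
    return '%s: %s' % (word_count[0], word_count[1])

def map_tuple_style(word, count):
    # beam.MapTuple style: the tuple is unpacked into named parameters.
    return '%s: %s' % (word, count)

mapped = [map_style(kv) for kv in counts]
unpacked = [map_tuple_style(*kv) for kv in counts]
assert mapped == unpacked
```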
@@ -2453,7 +2453,7 @@ public class Jenkins extends AbstractCIBase implements DirectlyModifiableTopLeve
    @CheckForNull
    public String getConfiguredRootUrl() {
        JenkinsLocationConfiguration config = JenkinsLocationConfiguration.get();
-       return config != null ? config.getUrl() : null;
+       return config.getUrl();
    }

    /**
[Jenkins->[getUser->[get],_cleanUpShutdownTcpSlaveAgent->[add],setNumExecutors->[updateComputerList],getPlugin->[getPlugin],getCategorizedManagementLinks->[all,add],getViewActions->[getActions],getJDK->[getJDKs,get],setViews->[addView],getCloud->[getByName],getStaplerFallback->[getPrimaryView],getStoredVersion->[get],getViews->[getViews],doDoFingerprintCheck->[isUseCrumbs],deleteView->[deleteView],getLabel->[get],_cleanUpInterruptReloadThread->[add],doConfigSubmit->[save,updateComputerList],CloudList->[onModified->[onModified]],doCheckDisplayName->[isNameUnique,isDisplayNameUnique],_cleanUpPersistQueue->[save,add],getLabelAtom->[get],setBuildsAndWorkspacesDir->[isDefaultWorkspaceDir,isDefaultBuildDir],reload->[loadTasks,save,reload,executeReactor],doConfigExecutorsSubmit->[all,updateComputerList],DescriptorImpl->[getDynamic->[getDescriptor],DescriptorImpl],checkRawBuildsDir->[expandVariablesForDirectory],_cleanUpShutdownThreadPoolForLoad->[add],isDisplayNameUnique->[getDisplayName],_cleanUpRunTerminators->[onTaskFailed->[getDisplayName],onTaskCompleted->[getDisplayName],onTaskStarted->[getDisplayName],add],getJobNames->[getFullName,add],doChildrenContextMenu->[add,getViews,getDisplayName],doLogout->[doLogout],getActiveInstance->[get],getNode->[getNode],copy->[copy],shouldShowStackTrace->[getName],updateNode->[updateNode],doSubmitDescription->[doSubmitDescription],doCheckURIEncoding->[doCheckURIEncoding],getItem->[getItem,get],doViewExistsCheck->[getView],getUnprotectedRootActions->[getActions,add],setAgentProtocols->[add],disableSecurity->[setSecurityRealm],onViewRenamed->[onViewRenamed],getDescriptorByName->[getDescriptor],loadConfig->[getConfigFile],getRootUrl->[get],refreshExtensions->[getInstance,add,getExtensionList],getRootPath->[getRootDir],getView->[getView],putItem->[get],_cleanUpShutdownTimer->[add],_cleanUpDisconnectComputers->[add],getAllThreadDumps->[get,getComputers],createProject->[createProject,getDescriptor],MasterComputer->[doConfigSubmit->[doConfi
gExecutorsSubmit],hasPermission->[hasPermission],get],createProjectFromXML->[createProjectFromXML],getAgentProtocols->[add],doScript->[getView,getACL],_cleanUpReleaseAllLoggers->[add],isRootUrlSecure->[getRootUrl],EnforceSlaveAgentPortAdministrativeMonitor->[doAct->[forceSetSlaveAgentPort,getExpectedPort],isActivated->[get,getSlaveAgentPortInitialValue],getExpectedPort->[getSlaveAgentPortInitialValue]],setSecurityRealm->[get],getItems->[getItems,add],doCheckViewName->[getView,checkGoodName],removeNode->[removeNode],getSelfLabel->[getLabelAtom],fireBeforeShutdown->[all,add],doSimulateOutOfMemory->[add],restartableLifecycle->[get],expandVariablesForDirectory->[expandVariablesForDirectory,getFullName],_getFingerprint->[get],getManagementLinks->[all],addView->[addView],getPlugins->[getPlugin,getPlugins,add],save->[getConfigFile],getPrimaryView->[getPrimaryView],makeSearchIndex->[get->[getView],makeSearchIndex,add],getNodes->[getNodes],lookup->[get,getInstanceOrNull],getLegacyInstanceId->[getSecretKey],saveQuietly->[save],trimLabels->[trimLabels],getLifecycle->[get],getInstanceOrNull->[getInstance],executeReactor->[containsLinkageError->[containsLinkageError],runTask->[runTask]],setNodes->[setNodes],loadTasks->[run->[setSecurityRealm,getExtensionList,getNodes,setNodes,remove,add,loadConfig],add],remove->[remove],getDescriptorOrDie->[getDescriptor],getLabelAtoms->[add],getItemByFullName->[getItemByFullName,getItem],doCreateView->[addView],getExtensionList->[get,getExtensionList],getLabels->[add],restart->[restartableLifecycle],isNameUnique->[getItem],getWorkspaceFor->[all],_cleanUpShutdownPluginManager->[add],getRootDirFor->[getRootDirFor,getRootDir],canDelete->[canDelete],getInstance->[getInstanceOrNull],getFingerprint->[get],getAuthentication->[getAuthentication2],doScriptText->[getView,getACL],getDynamic->[getActions],_cleanUpPluginServletFilters->[cleanUp,add],_cleanUpShutdownTriggers->[add],addNode->[addNode],updateNewComputer->[updateNewComputer],getTopLevelItemName
s->[add],doQuietDown->[doQuietDown,isQuietingDown],safeRestart->[restartableLifecycle],updateComputerList->[updateComputerList],performRenameMigration->[trimLabels],rebuildDependencyGraphAsync->[get,rebuildDependencyGraph],getConfiguredRootUrl->[get],_cleanUpAwaitDisconnects->[get,add],readResolve->[getSlaveAgentPortInitialValue],getName,get]]
Get the root URL of the Jenkins installation.
This can never be null because `ExtensionList#getInstance` throws `IllegalStateException` if the class cannot be found.
@@ -0,0 +1,17 @@
+package com.metamx.druid.indexer.data;
+
+import java.nio.ByteBuffer;
+
+import com.fasterxml.jackson.annotation.JsonSubTypes;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
+
+/**
+ * @author jan.rudert
+ */
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "type", defaultImpl = StringInputRowParser.class)
+@JsonSubTypes(value = {
+    @JsonSubTypes.Type(name = "protobuf", value = ProtoBufInputRowParser.class),
+    @JsonSubTypes.Type(name = "string", value = StringInputRowParser.class)
+})
+public interface ByteBufferInputRowParser extends InputRowParser<ByteBuffer> {
+}
[No CFG could be retrieved]
No Summary Found.
I am generally against `@author` tags. In general, the files needs to be maintainable by everyone who works on a project and as changes are made to a file, it's often the case that an author will exist as a tag on the file, but if you actually go and blame/annotate the file, you will see that it was completely changed by someone else and the author isn't necessarily even aware of what its current state is. I won't make you remove them though, if you really want them :)
@@ -55,6 +55,7 @@ class NodeCounts(Model):
         'running': {'required': True},
         'starting': {'required': True},
         'start_task_failed': {'required': True},
+        'leaving_pool': {'required': True},
         'unknown': {'required': True},
         'unusable': {'required': True},
         'waiting_for_start_task': {'required': True},
[NodeCounts->[__init__->[super]]]
The number of nodes in the starting state. Node count information for a specific node.
In order for this to be not breaking, this model has to be used only as output (i.e. from Azure to customer). Please confirm that no operation is taking this as input.
@@ -98,11 +98,14 @@ RSpec.describe "Using the editor", type: :system do
   describe "using v2 editor", js: true do
     it "fill out form with rich content and click publish" do
       visit "/new"
-      fill_in "article-form-title", with: "This is a test"
+      fill_in "article-form-title", with: "This is a <span> test"
       fill_in "tag-input", with: "What, Yo"
       fill_in "article_body_markdown", with: "Hello"
       find("button", text: /\APublish\z/).click
+
+      expect(page).to have_xpath("//div[@class='crayons-article__header__meta']//h1")
       expect(page).to have_text("Hello")
+      expect(page).not_to have_text("</span>")
       expect(page).to have_link("#what", href: "/t/what")
     end
   end
[fill_markdown_with->[visit,within,fill_in],read_from_file->[dirname,read,join],visit,create,find,let,describe,join,it,to,read_from_file,before,have_text,and,fill_markdown_with,require,click,include,dirname,have_link,execute_script,read,each,gsub,update!,have_css,fill_in,within,context,gsub!,sign_in,after,evaluate_script]
describe v2 editor.
Do we need a similar test for the v1 editor or is this not an issue there?
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
 """
 Copyright (c) 2019 Intel Corporation
 Licensed under the Apache License, Version 2.0 (the "License");
[draw_poses->[astype,edge,circle,tuple,range,line,array],Plotter3d->[_plot_edges->[line,astype,tuple,reshape,dot],plot->[_plot_edges,_draw_axes,_get_rotation,len,fill],_draw_axes->[line,astype,tuple,dot],mouse_callback->[min,max],_get_rotation->[cos,sin,array],__init__->[range,float32,append,array],array],array]
Creates a new object with the properties of the given object. Plots the n-dimensional grids.
`human_pose_estimation_3d_demo`'s modules are not supposed to be run separately. I suggest removing the line for all its modules.
@@ -56,6 +56,12 @@ func resourceAwsEbsVolume() *schema.Resource {
				ForceNew:     true,
				ValidateFunc: validateArn,
			},
+			"multi_attach": {
+				Type:     schema.TypeBool,
+				Optional: true,
+				Computed: true,
+				ForceNew: true,
+			},
			"size": {
				Type:     schema.TypeInt,
				Optional: true,
[GetChange,IgnoreAws,Message,BoolValue,Ec2KeyValueTags,ModifyVolume,Set,Code,NonRetryableError,Ec2UpdateTags,Int64Value,GetOk,DescribeVolumes,CreateVolume,HasChange,Errorf,SetId,RetryableError,Bool,IgnoreConfig,Id,Int64,Get,Map,Printf,DeleteVolume,StringValue,Println,Sprintf,WaitForState,String,Retry]
The resource schema. ResourceConstructor for the EBS Volume ec2. CreateVolumeInput.
Nit: This attribute here and in the data source should be named `multi_attach_enabled` to match the EC2 API.
@@ -104,7 +104,8 @@ func (d *Delegate) ServicesForSpec(jb job.Job) ([]job.Service, error) {
		db:          d.db,
		pipelineORM: d.pipelineORM,
		job:         jb,
-		mbLogs:      utils.NewMailbox(50),
+		mbOracleRequests:       utils.NewMailbox(50),
+		mbOracleCancelRequests: utils.NewMailbox(50),
		minIncomingConfirmations: uint64(minIncomingConfirmations),
		requesters:               concreteSpec.Requesters,
		minContractPayment:       concreteSpec.MinContractPayment,
[ServicesForSpec->[ToInt,Wrapf,With,MinIncomingConfirmations,NewMailbox,Client,NewOperator,String,Errorf,Address,Get,LogBroadcaster,Config,Named],handleCancelOracleRequest->[Errorw,DefaultQueryCtx,WithContext,String,MarkConsumed,LoadAndDelete],handleOracleRequest->[DefaultQueryCtx,Done,Add,MinimumContractPayment,ToStrings,NewRun,ValueOrZero,Errorw,Link,Infow,Cmp,CombinedContext,LoadOrStore,MarkConsumed,allowRequester,Run,WithContext,Err,Sprintf,Background,String,NewVarsFrom,Warnw],Start->[StartOnce,Done,ExternalIDEncodeStringToTopic,Topic,ExternalIDEncodeBytesToTopic,run,Address,Register,Add],Close->[StopOnce,Range,Wait],HandleLog->[Deliver,Error],run->[Notify,Done,handleReceivedLogs],handleReceivedLogs->[ExternalIDEncodeStringToTopic,Error,handleCancelOracleRequest,DecodedLog,Errorw,RawLog,ValueOf,Debugw,DefaultQueryCtx,WithContext,ExternalIDEncodeBytesToTopic,IsNil,handleOracleRequest,Errorf,Warnf,WasAlreadyConsumed,Retrieve],Sprintf,Hex]
ServicesForSpec returns a list of services that can be started for the given job. Add a service to the list of services that are not yet registered.
I think 50 is way too low, we recently increased the VRF to 100k and could do the same here. Imagine we replay a few thousand blocks on matic, its quite possible you could quickly queue up a few thousand requests.
@@ -279,12 +279,12 @@ public class LibvirtComputingResource extends ServerResourceBase implements Serv

    @Override
    public ExecutionResult executeInVR(final String routerIp, final String script, final String args) {
-       return executeInVR(routerIp, script, args, _timeout / 1000);
+       return executeInVR(routerIp, script, args, _timeout);
    }

    @Override
-   public ExecutionResult executeInVR(final String routerIp, final String script, final String args, final int timeout) {
-       final Script command = new Script(_routerProxyPath, timeout * 1000, s_logger);
+   public ExecutionResult executeInVR(final String routerIp, final String script, final String args, final Duration timeout) {
+       final Script command = new Script(_routerProxyPath, timeout, s_logger);
        final AllLinesParser parser = new AllLinesParser();
        command.add(script);
        command.add(routerIp);
[LibvirtComputingResource->[getInterfaces->[getInterfaces],isSnapshotSupported->[executeBashScript],getNetworkStats->[networkUsage],cleanupVMNetworks->[getAllVifDrivers],cleanupNetworkElementCommand->[vifHotUnPlug,VifHotPlug,getBroadcastUriFromBridge],getVPCNetworkStats->[configureVPCNetworkUsage],destroyNetworkRulesForVM->[getInterfaces],configure->[getDefaultKvmScriptsDir,getDefaultNetworkScriptsDir,configure,getDeveloperProperties,getDefaultStorageScriptsDir,getDefaultHypervisorScriptsDir,getDefaultDomrScriptsDir],getVifDriverClass->[configure],createVMFromSpec->[getUuid],vifHotUnPlug->[getAllVifDrivers],post_default_network_rules->[getInterfaces],getGuestDiskModel->[isGuestPVEnabled],checkBridgeNetwork->[matchPifFileInDirectory],configureTunnelNetwork->[findOrCreateTunnelNetwork],getDisks->[getDisks],defaultNetworkRules->[getInterfaces],rebootVM->[getPif,startVM],attachOrDetachISO->[cleanupDisk],prepareNetworkElementCommand->[VifHotPlug,getBroadcastUriFromBridge],stopVM->[stopVM],findOrCreateTunnelNetwork->[checkNetwork],getVmState->[convertToPowerState],getVersionStrings->[KeyValueInterpreter,getKeyValues],attachOrDetachDisk->[getUuid],initialize->[getVersionStrings,getUuid],executeInVR->[executeInVR],getHostVmStateReport->[convertToPowerState,getHostVmStateReport],getVmDiskStat->[getDomain,getDisks],getDeveloperProperties->[getEndIpFromStartIp],getVncPort->[getVncPort],getVmStat->[getInterfaces,getDomain,VmStats,getDisks],createVbd->[getUuid],syncNetworkGroups->[getRuleLogsForVms],getBroadcastUriFromBridge->[matchPifFileInDirectory]]]
Executes a script in the VR proxy.
Please consider adding an overridden version of the `Script(String, int, Logger)` constructor that accepts a `Duration` instance to encapsulate this conversion.
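The design point of the Java change, sketched in Python: passing a duration object instead of a bare int removes the unit ambiguity (seconds vs. milliseconds) that forced the `/ 1000` and `* 1000` conversions. The function name and the legacy-seconds assumption are illustrative, not from the patch.

```python
from datetime import timedelta

def execute_in_vr(timeout):
    # Accept a duration object; a bare number is treated as legacy seconds.
    if not isinstance(timeout, timedelta):
        timeout = timedelta(seconds=timeout)
    # The underlying script runner wants milliseconds; convert in one place.
    return int(timeout.total_seconds() * 1000)
```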
@@ -473,8 +473,13 @@ void GenericCAO::setAttachment(int parent_id, const std::string &bone, v3f posit
		o->removeAttachmentChild(m_id);
	if (parent)
		parent->addAttachmentChild(m_id);
+#if 0
+	printf("Attach id=%d to parent=%d (old=%d)\n", m_id, parent_id, old_parent);
+	std::cout << "\tpos=" << PP(position) << ", rot=" << PP(rotation) << std::endl;
+#endif
	}
+
+	updateAttachments();
 }
[getLightPosition->[v3f],clearParentAttachment->[setAttachment,v3f],clearChildAttachments->[setAttachment,v3f],updateLight->[getParent],updateNametag->[getSceneNode],processInitData->[updateNodePos,processMessage,init], ClientActiveObject->[getType],processMessage->[setAttachment,updateNametag,visualExpiryRequired,clearParentAttachment,update,clearChildAttachments,addAttachmentChild,init,updateTexturePos,v3f,updateTextures,updateAnimation,updateNodePos,getParent,updateAnimationSpeed],removeFromScene->[clearParentAttachment],setNodeLight->[getSceneNode],step->[getSceneNode,update,removeFromScene,translate,v3f,addToScene,updateNodePos,getParent],addToScene->[updateNodePos,getSceneNode,v3f],updateNodePos->[getSceneNode,getParent], removeFromScene->[removeFromScene],updateAttachments->[getAnimatedMeshSceneNode,getSceneNode,getParent], ClientActiveObject->[getType],directReportPunch->[updateTextures]]
Method to set the attachment properties of an object.
should probably be removed from this PR
@@ -82,7 +82,10 @@ function embed_longtext_menu($hook, $type, $items, $vars) {
  */
 function embed_select_tab($hook, $type, $items, $vars) {
	$tab_name = elgg_extract('tab', $vars);
-
+	if($tab_name == "")
+	{
+		$tab_name = "file";
+	}
	foreach ($items as $item) {
		if ($item->getName() == $tab_name) {
			$item->setSelected();
[embed_set_thumbnail_url->[exists,getIcon],embed_longtext_menu->[isMember],embed_select_tab->[setSelected,all,getName]]
Select the first item in the list that is an embed tab.
better would be `$tab_name = elgg_extract('tab', $vars, 'file');`
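The reviewer's one-liner collapses the lookup and the fallback into a single call. A Python analogue of the same pattern (the function and names are illustrative, not the Elgg API); `or default` also covers the empty-string case that the patch checks explicitly:

```python
def select_tab(vars_, default="file"):
    # One lookup with a fallback, instead of a lookup followed by an
    # empty-string check as in the patched PHP.
    return vars_.get("tab") or default
```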
@@ -6,10 +6,11 @@ type BaseTask struct {
	outputs []Task
	inputs  []Task

-	id      int
-	dotID   string
-	Index   int32         `mapstructure:"index" json:"-" `
-	Timeout time.Duration `mapstructure:"timeout"`
+	id        int
+	dotID     string
+	Index     int32         `mapstructure:"index" json:"-" `
+	Timeout   time.Duration `mapstructure:"timeout"`
+	FailEarly string        `mapstructure:"failEarly"`
 }

 func NewBaseTask(id int, dotID string, inputs, outputs []Task, index int32) BaseTask {
[TaskTimeout->[Duration]]
NewBaseTask creates a new BaseTask from a given sequence of tasks.
@spooktheducks I wasn't able to parse to `bool` directly here, `UnmarshalTaskFromMap` would fail.
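A sketch of the workaround under discussion: the field arrives as a string because decoding it straight to bool failed, so it is stored as a string and parsed on use. This Python version is an analogue only; the accepted spellings are an assumption, not taken from the Go code.

```python
def parse_fail_early(raw):
    # Parse a string-typed flag to bool at the point of use.
    value = raw.strip().lower()
    if value in ("true", "t", "1", "yes"):
        return True
    if value in ("false", "f", "0", "no", ""):
        return False
    raise ValueError("cannot parse %r as a bool" % (raw,))
```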
@@ -745,4 +745,13 @@ public class HttpObjectAggregatorTest {
            channel.close();
        }
    }
+
+    private static Buffer copiedBuffer(BufferAllocator allocator, String data, Charset charset) {
+        final byte[] bytes = data.getBytes(charset);
+        return allocator.allocate(bytes.length).writeBytes(bytes);
+    }
+
+    private static Buffer copiedBuffer(BufferAllocator allocator, byte[] bytes) {
+        return allocator.allocate(bytes.length).writeBytes(bytes);
+    }
 }
[HttpObjectAggregatorTest->[testAggregateTransferEncodingChunked->[checkContentBuffer],testAggregateWithTrailer->[checkContentBuffer]]]
Selective response aggregation. Close the channel and close any resources that can be found.
This last one is no longer necessary now that we have `BufferAllocator.copyOf`
@@ -34,7 +34,7 @@ except ImportError:
     GIT_AVAILABLE = False

-DEBUG = False  # change this to true to print to stdout anyway
+DEBUG = True  # change this to true to print to stdout anyway

 def is_this_circleci():
[skipIfCircleCI->[is_this_circleci],capture_output->[TeeStringIO],eval_model->[eval_model,capture_output],TeeStringIO->[write->[write]],download_unittest_models->[capture_output],display_data->[capture_output,display_data],train_model->[capture_output,tempdir],git_ls_dirs->[git_ls_files]]
Returns whether we are currently running in CircleCI.
@stephenroller Is this supposed to stay True?
@@ -65,11 +65,11 @@ class Order(models.Model, ItemSet):

     def save(self, *args, **kwargs):
         if not self.token:
-            for _i in range(100):
-                token = str(uuid4())
-                if not type(self).objects.filter(token=token).exists():
-                    self.token = token
-                    break
+            token = str(uuid4())
+            if not type(self).objects.filter(token=token).exists():
+                self.token = token
+            else:
+                raise RuntimeError('Could not create unique token.')
         return super(Order, self).save(*args, **kwargs)

     def change_status(self, status):
[OrderedItem->[change_quantity->[save,get_total_quantity,get_items,update_delivery_cost,change_status],OrderedItemManager],DeliveryGroup->[get_weight->[get_weight],change_status->[save],update_delivery_cost->[save,get_weight,get_delivery_total],DeliveryGroupManager],Payment->[get_purchased_items->[get_items],send_confirmation_email->[get_user_email]],DeliveryGroupManager->[duplicate_group->[save]],Order->[send_confirmation_email->[get_user_email],change_status->[save]],OrderedItemManager->[move_to_group->[save,get_total_quantity,get_items,update_delivery_cost,change_status]]]
Add a random token to the order if it doesn't already exist.
Unique constraint will take care of that.
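The reviewer's point, sketched: a pre-check with `exists()` is racy, so let the column's unique constraint detect collisions at insert time and retry a bounded number of times. Everything here is a simulation (a `set` stands in for the table, `UniqueViolation` for the database error); none of it is the Django API from the patch.

```python
import uuid

class UniqueViolation(Exception):
    """Stand-in for the database's unique-constraint error."""

def insert_token(db, token):
    # Simulated INSERT against a column with a UNIQUE constraint.
    if token in db:
        raise UniqueViolation(token)
    db.add(token)

def save_with_unique_token(db, attempts=3):
    # No racy exists() pre-check: attempt the insert and retry on conflict.
    for _ in range(attempts):
        token = str(uuid.uuid4())
        try:
            insert_token(db, token)
            return token
        except UniqueViolation:
            continue
    raise RuntimeError('Could not create unique token.')
```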
@@ -51,6 +51,12 @@ public class EnvVarsInConfigTasksTest extends HudsonTestCase {
        slaveRegular = createSlave(new LabelAtom("slaveRegular"));
    }

+   @Override
+   protected void tearDown() throws Exception {
+       super.tearDown();
+       tmp.delete();
+   }
+
    private String withVariable(String s) {
        return s + "${" + DUMMY_LOCATION_VARNAME + "}";
    }
[EnvVarsInConfigTasksTest->[testFreeStyleAntOnSlave->[assertTrue,matches,getInstallations,getBuildLog,setAssignedLabel,Ant,getDisplayName,assertFalse,getJDK,getResource,setJDK,println,assumeFalse,assertBuildStatus,assertBuildStatusSuccess,ExtractResourceSCM,getSelfLabel,get,createFreeStyleProject,Ant_ExecutableNotFound,contains,add,setScm],testNativeMavenOnSlave->[createMavenProject,getResource,assertBuildStatusSuccess,setJDK,setMaven,ExtractResourceSCM,assertFalse,getSelfLabel,setAssignedLabel,get,println,assertBuildStatus,setGoals,contains,getDisplayName,setScm,getBuildLog,getJDK],testFreeStyleShellOnSlave->[assertTrue,getBuildLog,setAssignedLabel,isFamily,Shell,getDisplayName,assertFalse,getJDK,getResource,setJDK,println,BatchFile,assertBuildStatusSuccess,ExtractResourceSCM,getSelfLabel,get,createFreeStyleProject,contains,add,setScm],setUp->[setInstallations,MavenInstallation,configureDefaultMaven,configureDefaultAnt,EnvVars,getHome,createSlave,getInstallations,AntInstallation,setUp,add,withVariable,LabelAtom,getJDK,JDK],testFreeStyleMavenOnSlave->[assertTrue,getBuildLog,setAssignedLabel,getDisplayName,assertFalse,getJDK,getResource,Maven,setJDK,println,assertBuildStatus,assertBuildStatusSuccess,ExtractResourceSCM,getSelfLabel,get,createFreeStyleProject,contains,add,setScm],getBuildLog->[getLog]]]
This method is called by the Jenkins installation when it is not already set up. This method is used to test whether variables are expanded on a slave with a prepared environment.
Urg - reformat the whole of this class - or match the surroundings?
@@ -116,7 +116,7 @@ class CompositeMetric(MetricBase):
         super(CompositeMetric, self).__init__(name, kwargs)
         self._metrics = []

-    def add_metric(self, metric):
+    def update(self, metric):
         if not isinstance(metric, MetricBase):
             raise ValueError("SubMetric should be inherit from MetricBase.")
         self._metrics.append(metric)
[EditDistance->[update->[_is_number_,_is_numpy_]],Auc->[update->[_is_numpy_]],Accuracy->[update->[_is_number_or_matrix_,_is_number_]],CompositeMetric->[eval->[eval]],ChunkEvaluator->[update->[_is_number_or_matrix_]],_is_number_or_matrix_->[_is_number_],DetectionMAP->[update->[_is_number_or_matrix_,_is_number_]]]
Initialize a composite metric.
I added the interface `add_metric` on purpose; it means adding a new metric to the composed metrics.
@@ -612,7 +612,9 @@ public class BigtableIOTest { final String table = "TEST-TABLE"; PCollection<KV<ByteString, Iterable<Mutation>>> emptyInput = - p.apply(Create.<KV<ByteString, Iterable<Mutation>>>of()); + p.apply( + Create.empty( + KvCoder.of(ByteStringCoder.of(), IterableCoder.of(ProtoCoder.of(Mutation.class))))); // Exception will be thrown by write.validate() when write is applied. thrown.expect(IllegalArgumentException.class);
[BigtableIOTest->[testWriting->[apply],FakeBigtableWriter->[writeRecord->[getTable,verifyTableExists]],testReadingFailsTableDoesNotExist->[apply],runReadTest->[apply],FakeBigtableService->[verifyTableExists->[tableExists],setupSampleRowKeys->[getTable,verifyTableExists]],testReadingWithFilter->[apply->[apply],KeyMatchesRegex,runReadTest],testWritingFailsTableDoesNotExist->[apply],testWritingFailsBadElement->[apply],FakeBigtableReader->[advance->[makeRow,apply],KeyMatchesRegex,verifyTableExists],makeTableData->[makeRow],testReadingWithKeyRange->[filterToRange,runReadTest]]]
Test writing fails if table does not exist.
side thought: this could also take a TypeDescriptor if Create had access to the default coder registry :p
@@ -50,13 +50,13 @@ spell: # NB: "third_party" only exists for automate-gateway, but no harm having it for other dirs here. semgrep: # uncomment if custom rules beyond automate-ui ever get added -# semgrep --config $(REPOROOT)/semgrep --exclude third_party - semgrep --config https://semgrep.dev/p/r2c-ci --exclude third_party --autofix +# semgrep --config $(REPOROOT)/semgrep --exclude third_party --exclude *_test.go --exclude *.pb.go + semgrep --config https://semgrep.dev/p/r2c-ci --exclude third_party --exclude *_test.go --exclude *.pb.go #: Security validation via semgrep; autofix where possible semgrep-and-fix: # uncomment if custom rules beyond automate-ui ever get added -# semgrep --config $(REPOROOT)/semgrep --exclude third_party --autofix - semgrep --config https://semgrep.dev/p/r2c-ci --exclude third_party --autofix +# semgrep --config $(REPOROOT)/semgrep --exclude third_party --exclude *_test.go --exclude *.pb.go --autofix + semgrep --config https://semgrep.dev/p/r2c-ci --exclude third_party --exclude *_test.go --exclude *.pb.go --autofix .PHONY: lint fmt fmt-check golang_version_check semgrep semgrep-and-fix
[No CFG could be retrieved]
Security validation via semgrep.
Am I wrong with the idea that semgrep has a config file, too? Using that instead, we may be able to share it between the Makefile and the BK pipeline definition...? Or, rather, and perhaps preferably, we could use the make target from within the BK job, I would think!
@@ -47,7 +47,9 @@ class <%= entityClass %>GatlingTest extends Simulation { .acceptEncodingHeader("gzip, deflate") .acceptLanguageHeader("fr,fr-fr;q=0.8,en-us;q=0.5,en;q=0.3") .connectionHeader("keep-alive") - .userAgentHeader("Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:33.0) Gecko/20100101 Firefox/33.0") + .userAgentHeader("Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:33.0) Gecko/20100101 Firefox/33.0")<%_ if (authenticationType === 'oauth2') { _%> + .disableFollowRedirect // We must follow redirects manually to get the xsrf token from the keycloak redirect + .disableAutoReferer<%_ } _%> val headers_http = Map( "Accept" -> """application/json"""
[No CFG could be retrieved]
Creates a new Gatling test and a map of HTTP headers that can be used to authenticate the request.
the condition with `<%_` should be in a new line
@@ -78,6 +78,15 @@ class ProofOfPossessionTest(unittest.TestCase): self.assertFalse(self.proof_of_pos.perform(self.achall)) self.assertTrue(self.proof_of_pos.perform(self.achall).verify()) + def test_perform_with_key_dir(self): + # Remove the matching certificate + self.installer.get_all_certs_keys.return_value.pop() + + self.config.key_dir = tempfile.mkdtemp('key_dir') + crypto_util.init_save_key(2048, self.config.key_dir) + + self.assertTrue(self.proof_of_pos.perform(self.achall).verify()) + if __name__ == "__main__": unittest.main() # pragma: no cover
[ProofOfPossessionTest->[test_perform_bad_challenge->[perform,Hints,ProofOfPossession,assertEqual,ChallengeBody,JWKOct],test_perform_no_input->[assertTrue,perform],setUp->[mkstemp,Hints,ProofOfPossession,ChallengeBody,JWKRSA,zip,MagicMock],test_perform_with_input->[assertTrue,perform,mock_input,pop,assertFalse],tearDown->[remove],patch],vector_path,main,load_rsa_private_key]
Test perform with input.
There's no need to generate a new key here. `CERT2_KEY_PATH` and `CERT3_KEY_PATH` are paths to already generated private keys. `CERT3_KEY_PATH` is the one that matches the public key for the challenge used in these tests.
@@ -808,6 +808,8 @@ class Jetpack_SSO { exit; } + add_filter( 'jetpack_sso_default_to_sso_login', '__return_false' ); + JetpackTracking::record_user_event( 'sso_login_failed', array( 'error_message' => 'cant_find_user' ) );
[Jetpack_SSO->[msg_login_by_jetpack->[get_sso_required_message],build_reauth_and_sso_url->[build_sso_url],login_init->[wants_to_login]]]
Handles the login request. This method is used to filter the user object. This function creates a user if it doesn't exist yet, then updates the user's meta data and login status. This filter is documented in core (src/wp-includes).
Should we update the filter as well? I think it's fine.
@@ -19,7 +19,7 @@ $factory->define(App\User::class, function (Faker $faker) { return [ 'name' => $faker->name, 'email' => $faker->unique()->safeEmail, - 'password' => $password ?: $password = bcrypt('secret'), + 'password' => $password ?? bcrypt('secret'), 'remember_token' => str_random(10), ]; });
[unique,define]
Get the unique user id.
`$password ?? $password = bcrypt('secret');`, right?
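The reviewer's point is the difference between PHP's truthiness-based `?:` shorthand and the null coalescing `??`. A minimal Python analogue of the two fallback styles (the helper names are hypothetical; this only illustrates the semantics, not the PHP code itself):

```python
def truthy_default(value, default):
    # Analogue of PHP's `?:`: falls back on ANY falsy value ("" / 0 / None).
    return value if value else default

def null_default(value, default):
    # Analogue of PHP's `??`: falls back only when the value is absent.
    return value if value is not None else default

# An empty string is falsy but not null, so the two differ:
assert truthy_default("", "secret") == "secret"
assert null_default("", "secret") == ""
assert null_default(None, "secret") == "secret"
```

This is why `??` is the safer choice when an empty-but-set password should be kept.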
@@ -30,7 +30,7 @@ module Idv end def message - t('headings.lock_failure') + nil end def next_steps
[JurisdictionFailurePresenter->[message->[t],i18n_args->[state_name_for_abbrev],description->[t],title->[t],header->[t],attr_reader,delegate]]
Returns the presenter's message, which no longer uses the lock-failure translation.
What is going on here? Why did we remove this translation?
@@ -99,12 +99,14 @@ class ImageReader(MeshReader): for x in range(0, width): for y in range(0, height): qrgb = img.pixel(x, y) - avg = float(qRed(qrgb) + qGreen(qrgb) + qBlue(qrgb)) / (3 * 255) - height_data[y, x] = avg + if use_transparency_model: + height_data[y, x] = (0.299 * math.pow(qRed(qrgb) / 255.0, 2.2) + 0.587 * math.pow(qGreen(qrgb) / 255.0, 2.2) + 0.114 * math.pow(qBlue(qrgb) / 255.0, 2.2)) + else: + height_data[y, x] = (0.212655 * qRed(qrgb) + 0.715158 * qGreen(qrgb) + 0.072187 * qBlue(qrgb)) / 255 # fast computation ignoring gamma and degamma Job.yieldThread() - if not lighter_is_higher: + if lighter_is_higher is use_transparency_model: height_data = 1 - height_data for _ in range(0, blur_iterations):
[ImageReader->[_generateSceneNode->[round,max,setMeshData,pixel,SceneNode,MeshBuilder,scaled,int,calculateNormals,log,set,qGreen,QImage,qBlue,reserveFaceCount,zeros,range,isNull,height,repeat,yieldThread,concatenate,float,width,build,qRed,pad,reshape,Vector,addFaceByPoints,array],__init__->[ImageReaderUI,super],_read->[_generateSceneNode,getWidth,getDepth,max],preRead->[isNull,showConfigUI,height,max,getCancelled,setWidthAndDepth,log,waitForUIToClose,QImage,width]]]
Generates a scene node from the image by building a height map: with the transparency model enabled, height is derived from the gamma-corrected luminance of the non-transparent components; otherwise a fast linear luma is used, ignoring gamma.
use == for equality, `is` for identity checks
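To illustrate the reviewer's point: `==` compares values while `is` compares object identity, which matters as soon as the operands are not the exact same singleton. A small sketch:

```python
lighter_is_higher = True
use_transparency_model = 1  # truthy value, but not the bool singleton True

# Value equality: True == 1 holds in Python.
assert lighter_is_higher == use_transparency_model

# Identity: True and 1 are different objects, so `is` is False here,
# silently changing the branch taken in code like the diff above.
assert not (lighter_is_higher is use_transparency_model)
```

So `lighter_is_higher is use_transparency_model` only behaves as intended if both sides are guaranteed to be real `bool` objects.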
@@ -117,6 +117,16 @@ func (s *Suite) DeleteDataFromStorage() { // IngestService ingests a single HealthCheckEvent message into the database func (s *Suite) IngestService(event *habitat.HealthCheckEvent) { + eventsProcessed := s.Ingester.EventsProcessed() + bytes, err := proto.Marshal(event) + if err != nil { + fmt.Printf("Error trying to ingest hab service event: %s\n", err) + } + s.Ingester.IngestMessage(bytes) + s.WaitForEventsToProcess(eventsProcessed + 1) +} + +func (s *Suite) IngestServiceViaStorageClient(event *habitat.HealthCheckEvent) { err := s.StorageClient.IngestHealthCheckEvent(event) if err != nil { fmt.Printf("Error trying to ingest hab service event: %s\n", err)
[GetServiceGroups->[GetServiceGroups],IngestServices->[IngestService],GetServices->[GetServices]]
IngestService ingests a hab service event.
I don't think this is concurrency-safe -- can only one of these ever happen at the same time? If this could be called twice concurrently, we might miss an event.
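The reviewer's concern — snapshot the counter, ingest, then wait for snapshot + 1 — can be sketched in Python (the names are hypothetical stand-ins; the real code is Go):

```python
class Ingester:
    """Minimal stand-in for the suite's event counter."""

    def __init__(self):
        self.count = 0

    def events_processed(self):
        return self.count

    def ingest(self):
        self.count += 1

def wait_satisfied(ingester, baseline):
    # Models WaitForEventsToProcess(baseline + 1): returns as soon as
    # the counter reaches baseline + 1, whoever caused the increment.
    return ingester.events_processed() >= baseline + 1

ing = Ingester()
baseline_a = ing.events_processed()  # caller A snapshots 0
baseline_b = ing.events_processed()  # caller B also snapshots 0
ing.ingest()                         # only B's event has landed so far...
# ...yet A's wait condition is already satisfied by B's event:
assert wait_satisfied(ing, baseline_a)
```

With two concurrent callers, the wait can be released by the *other* caller's event, so a caller may proceed before its own event was processed.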
@@ -877,9 +877,15 @@ class DistributeTranspiler(object): # create table param and grad var in pserver program origin_param_var = self.origin_program.global_block().vars[ self.table_name] + + zero_dim = long( + math.ceil(origin_param_var.shape[0] / len(self.pserver_endpoints))) + table_shape = list(origin_param_var.shape) + table_shape[0] = zero_dim + param_var = pserver_program.global_block().create_var( name=origin_param_var.name, - shape=origin_param_var.shape, + shape=table_shape, dtype=origin_param_var.dtype, type=core.VarDesc.VarType.SELECTED_ROWS, persistable=True)
[DistributeTranspiler->[_append_pserver_ops->[_get_param_block->[same_or_split_var],_get_optimizer_input_shape,_get_param_block],get_startup_program->[_get_splited_name_and_shape->[same_or_split_var],_get_splited_name_and_shape],_create_ufind->[_is_op_connected],_orig_varname->[_get_varname_parts],_get_optimize_pass->[_is_opt_role_op],_is_splited_grad_var->[_orig_varname],get_pserver_program->[__clone_lr_op_sub_block__->[__clone_lr_op_sub_block__],__op_have_grad_input__,__append_optimize_op__,__clone_lr_op_sub_block__],get_trainer_program->[__str__],_init_splited_vars->[_update_dist_lookup_table_vars,slice_variable],__init__->[DistributeTranspilerConfig],_get_lr_ops->[_is_op_connected,_is_optimizer_op],_append_pserver_non_opt_ops->[_is_splited_grad_var],_is_opt_op_on_pserver->[same_or_split_var],_append_pserver_grad_merge_ops->[_get_varname_parts,_orig_varname]],slice_variable->[VarBlock]]
Creates the table parameter and gradient variables in the pserver program, splitting the table's first dimension across the pserver endpoints.
do not use long since it is not compatible with `py3`
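On the reviewer's `long` point: Python 3 removed `long`; `int` is arbitrary precision. Note that `/` also changed meaning between Python 2 and 3, so a version-stable form uses explicit ceiling division (the values here are illustrative):

```python
import math

shape0, num_endpoints = 10, 3

# Python 3: no `long`; plain int suffices for any magnitude.
zero_dim = int(math.ceil(shape0 / num_endpoints))
assert zero_dim == 4

# Equivalent without floats; same result on Python 2 and 3:
assert -(-shape0 // num_endpoints) == 4
```

The float-free form also avoids precision loss for very large shapes.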
@@ -113,7 +113,8 @@ def update_monitoring_service_from_balance_proof( "Skipping update to Monitoring service. " "Your channel balance {channel_balance} is less than " "the required minimum balance of {min_balance} " - "that you have set before sending the MonitorRequest" + "that you have set before sending the MonitorRequest," + " token address {token_address}" ) dai_token_network_address = views.get_token_network_address_by_token_address(
[update_monitoring_service_from_balance_proof->[from_balance_proof_signed_state,get_channelstate_by_canonical_identifier,broadcast,to_rdn,format,get_token_network_address_by_token_address,sign,info,effective_balance,warning,get_balance,to_checksum_address],send_pfs_update->[debug,get_channelstate_by_canonical_identifier,broadcast,state_from_raiden,from_channel_state,sign,to_checksum_address],get_logger]
Update monitoring service from a new balance proof. This method is called when a node is being monitored.
Can you make this a whole sentence? Like "... before sending the MonitorRequest, " "in the Token Network {token_address}."
@@ -278,12 +278,12 @@ func (ls *Source) SearchEntry(name, passwd string, directBind bool) *SearchResul var isAttributeSSHPublicKeySet = len(strings.TrimSpace(ls.AttributeSSHPublicKey)) > 0 - attribs := []string{ls.AttributeUsername, ls.AttributeName, ls.AttributeSurname, ls.AttributeMail} + attribs := []string{ls.AttributeUsername, ls.AttributeName, ls.AttributeSurname, ls.AttributeMail, ls.UserUID} if isAttributeSSHPublicKeySet { attribs = append(attribs, ls.AttributeSSHPublicKey) } - log.Trace("Fetching attributes '%v', '%v', '%v', '%v', '%v' with filter %s and base %s", ls.AttributeUsername, ls.AttributeName, ls.AttributeSurname, ls.AttributeMail, ls.AttributeSSHPublicKey, userFilter, userDN) + log.Trace("Fetching attributes '%v', '%v', '%v', '%v', '%v' with filter '%s' and base '%s'", ls.AttributeUsername, ls.AttributeName, ls.AttributeSurname, ls.AttributeMail, ls.UserUID, userFilter, userDN) search := ldap.NewSearchRequest( userDN, ldap.ScopeWholeSubtree, ldap.NeverDerefAliases, 0, 0, false, userFilter, attribs, nil)
[SearchEntry->[sanitizedUserDN,sanitizedUserQuery,findUserDN],SearchEntries->[UsePagedSearch],findUserDN->[sanitizedUserQuery]]
SearchEntry searches for a user in the LDAP server. If directBind is true it binds directly as the user, then fetches the user's attributes and returns a SearchResult object.
If `ls.UserUID` is blank is this actually ok?
@@ -212,8 +212,10 @@ export class AmpStory extends AMP.BaseElement { /** @private @const {!UnsupportedBrowserLayer} */ this.unsupportedBrowserLayer_ = new UnsupportedBrowserLayer(this.win); - /** Instantiates the viewport warning layer. */ - new ViewportWarningLayer(this.win, this.element); + if (!isExperimentOn(this.win, 'disable-amp-story-desktop')) { + /** Instantiates the viewport warning layer. */ + new ViewportWarningLayer(this.win, this.element); + } /** @private @const {!Array<!./amp-story-page.AmpStoryPage>} */ this.pages_ = [];
[AmpStory->[pauseCallback->[TOGGLE_PAUSED,PAUSED_STATE],buildSystemLayer_->[element],onPausedStateUpdate_->[ACTIVE,PAUSED],registerAndPreloadBackgroundAudio_->[upgradeBackgroundAudio,childElement,tagName],constructor->[documentStateFor,timerFor,getStoreService,registerServiceBuilder,platformFor,for,createPseudoLocale],initializeStandaloneStory_->[classList],isSwipeLargeEnoughForHint_->[abs],next_->[dev,element,next,ADVANCE_TO],setHistoryStatePageId_->[replaceState,isAd],onBookendStateUpdate_->[ACTIVE,PAUSED],getPageDistanceMapHelper_->[getAdjacentPageIds],showBookend_->[TOGGLE_BOOKEND],initializeListeners_->[CURRENT_PAGE_ID,AD_STATE,NEXT_PAGE,PREVIOUS_PAGE,MUTED_STATE,getMode,SHOW_NO_PREVIOUS_PAGE_HELP,PAGE_PROGRESS,DESKTOP_STATE,BOOKEND_STATE,CAN_SHOW_PREVIOUS_PAGE_HELP,DISPATCH_ACTION,REPLAY,PAUSED_STATE,getDetail,TAP_NAVIGATION,SUPPORTED_BROWSER_STATE,SWITCH_PAGE,debounce],isDesktop_->[isExperimentOn],addPage->[isAd],isBrowserSupported->[Boolean,CSS],getPageContainingElement_->[findIndex,element,closest],handlePreviewAttributes_->[removeAttributeInMutate,NEXT,getPreviousPageId,setAttributeInMutate,getNextPageId,PREVIOUS],isStandalone_->[STANDALONE],validateConsent_->[tagName,dev,indexOf,childElementByTag,removeChild,forEach,length,childElements],updateAudioIcon_->[TOGGLE_HAS_AUDIO],onResize->[TOGGLE_LANDSCAPE,TOGGLE_DESKTOP,isLandscape],buildCallback->[setAttribute,TOGGLE_DESKTOP,TOGGLE_RTL,actionServiceForDoc,isRTL,setWhitelist],getPagesByDistance_->[keys],replay_->[then,BOOKEND_STATE,removeAttributeInMutate,dev,VISITED],insertPage->[setAttribute,RETURN_TO,AUTO_ADVANCE_TO,isAd,ADVANCE_TO,dev,CAN_INSERT_AUTOMATIC_AD,element,id],onKeyDown_->[BOOKEND_STATE,RTL_STATE,LEFT_ARROW,RIGHT_ARROW,keyCode],updateBackground_->[url,computedStyle,color],onSupportedBrowserStateUpdate_->[dev],layoutCallback->[resolve,isBrowserSupported,TOGGLE_SUPPORTED_BROWSER],toggleElementsOnBookend_->[scopedQuerySelectorAll,resetStyles,prototype,setImportantStyles],previous_->[dev,previous],buildPaginationButtons_->[create],switchTo_->[setState,muteAllMedia,isAd,NOT_ACTIVE,TOGGLE_AD,removeAttributeInMutate,AD_SHOWING,CHANGE_PAGE,MUTED_STATE,VISITED,ACTIVE,setAttributeInMutate,PAUSED_STATE,beforeVisible,getNextPageId,element,isExperimentOn],performTapNavigation_->[NEXT,PREVIOUS],initializeBookend_->[dict,getImpl,createElementWithAttributes],preloadPagesByDistance_->[forEach,setDistance],getPageIndexById->[findIndex,user,element],hasBookend_->[CAN_SHOW_BOOKEND,components,resolve],getBackgroundUrl_->[dev,querySelector,getAttribute],updateViewportSizeStyles_->[vmax,vmin,vw,max,min,vh,px],isLayoutSupported->[CONTAINER],triggerActiveEventForPage_->[actionServiceForDoc,HIGH],getElementDistance->[getDistance],installGestureRecognizers_->[BOOKEND_STATE,CAN_SHOW_NAVIGATION_OVERLAY_HINT,get,data,preventDefault,onGesture],pauseStoryUntilConsentIsResolved_->[getConsentPolicyState,then,TOGGLE_PAUSED],getMaxMediaElementCounts->[min,VIDEO,AUDIO],lockBody_->[setImportantStyles,documentElement,body],initializeListenersForDev_->[getMode,getDetail,DEV_LOG_ENTRIES_AVAILABLE],getPageById->[dev],maybeLockScreenOrientation_->[mozLockOrientation,dev,message,lockOrientation,msLockOrientation],layoutStory_->[setState,then,NOT_ACTIVE,build,user,viewerForDoc,DESKTOP_STATE,id],onAdStateUpdate_->[MUTED_STATE],initializeStyles_->[querySelector],hideBookend_->[TOGGLE_BOOKEND],getPageIndex->[findIndex],rewriteStyles_->[textContent,isExperimentOn],forceRepaintForSafari_->[setStyle],initializePages_->[all,prototype,getImpl],resumeCallback->[TOGGLE_PAUSED],markStoryAsLoaded_->[INI_LOAD,STORY_LOADED,dispatch],getNextPage->[getNextPageId],whenPagesLoaded_->[filter,all,whenLoaded],getHistoryStatePageId_->[state],experimentalSwitchTo_->[setState,isAd,shift,MUTED_STATE,TOGGLE_AD,removeAttributeInMutate,VISITED,muteAllMedia,NOT_ACTIVE,CHANGE_PAGE,PAUSED_STATE,beforeVisible,element,length,resolve,unqueueStepInRAF,AD_SHOWING,ACTIVE,setAttributeInMutate,getNextPageId],BaseElement],registerElement,VIDEO,AUDIO,extension]
Initializes the AmpStory element, instantiating its layers and registering its services and localized strings.
This allows mobile landscape though, we need to keep it in mind if we ever allow launching this experiment
@@ -13,12 +13,13 @@ import ( // CrossLink is only used on beacon chain to store the hash links from other shards // signature and bitmap correspond to |blockNumber|parentHash| byte array -// Captial to enable rlp encoding +// Capital to enable rlp encoding // Here we replace header to signatures only, the basic assumption is the committee will not be // corrupted during one epoch, which is the same as consensus assumption type CrossLink struct { HashF common.Hash BlockNumberF *big.Int + ViewIDF *big.Int SignatureF [96]byte //aggregated signature BitmapF []byte //corresponding bitmap mask for agg signature ShardIDF uint32 //will be verified with signature on |blockNumber|blockHash| is correct
[IsSorted->[ShardID,Number],Sort->[ShardID,Number]]
Defines the CrossLink type stored on the beacon chain to hold hash links from other shards; block number and view ID are kept in big.Int format.
New implies returning a pointer, so make it a pointer, and all these crosslink methods should have pointer receivers.
@@ -174,7 +174,7 @@ public class DatabaseConfiguration { @Bean(initMethod = "start", destroyMethod = "stop") @Profile(JHipsterConstants.SPRING_PROFILE_DEVELOPMENT) public Object h2TCPServer() throws SQLException { - String port = "1" + env.getProperty("server.port"); + String port = String.valueOf(SocketUtils.findAvailableTcpPort()); log.debug("H2 database is available on port {}", port); return H2ConfigurationHelper.createServer(port); }
[No CFG could be retrieved]
Creates database configuration beans, including an H2 TCP server for development, and provides a validator factory bean.
maybe we should make this logging from debug to info to print the port
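What Spring's `SocketUtils.findAvailableTcpPort()` does can be sketched in Python by binding to port 0 and letting the OS hand out a free ephemeral port (a sketch of the technique, not the Spring implementation):

```python
import socket

def find_available_tcp_port():
    # Binding to port 0 asks the kernel for any currently free port;
    # getsockname() then reveals which port was assigned.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]

port = find_available_tcp_port()
assert 0 < port < 65536
```

Note the usual caveat: the port is released when the probe socket closes, so another process can grab it before the server binds it.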
@@ -250,14 +250,14 @@ namespace DotNetNuke.UI.Modules private void InjectModuleContent(Control content) { - if (this._moduleConfiguration.IsWebSlice && !Globals.IsAdminControl()) + if (!Globals.IsAdminControl()) { // Assign the class - hslice to the Drag-N-Drop Panel this.CssClass = "hslice"; var titleLabel = new Label { CssClass = "entry-title Hidden", - Text = !string.IsNullOrEmpty(this._moduleConfiguration.WebSliceTitle) ? this._moduleConfiguration.WebSliceTitle : this._moduleConfiguration.ModuleTitle, + Text = !string.IsNullOrEmpty(this._moduleConfiguration.ModuleTitle) ? this._moduleConfiguration.ModuleTitle : string.Empty, }; this.Controls.Add(titleLabel);
[ModuleHost->[RenderContents->[RenderContents,IsViewMode],LoadModuleControl->[LoadModuleControl,IsVersionRequest,IsViewMode,DisplayContent],LoadUpdatePanel->[InjectModuleContent,InjectMessageControl],OnPreRender->[OnPreRender]]]
Adds the module content into the list of controls that the module should have.
I think this whole if block can be removed as it was only for when the module was a webslice, right?
@@ -100,8 +100,14 @@ public class LocalIndexedRepo extends FixedIndexedRepo implements Refreshable, P } /** - * @param contentProvider the repository content provider - * @return the filename of the index on local storage + * <<<<<<< HEAD + * + * @param contentProvider + * the repository content provider + * @return the filename of the index on local storage ======= + * @param contentProvider + * the repository content provider @return the filename of the + * index on local storage >>>>>>> stash */ private File getIndexFile(IRepositoryContentProvider contentProvider) { String indexFileName = contentProvider.getDefaultIndexName(pretty);
[LocalIndexedRepo->[tooltip->[getLocation,regenerateAllIndexes,refresh],regenerateAllIndexes->[getIndexFile],putArtifact->[finishPut],generateIndex->[generateIndex],getLocation->[getLocation],title->[getHandle,getLocation],refresh->[regenerateAllIndexes],listRecurse->[listRecurse],setProperties->[setProperties],actions->[run->[regenerateAllIndexes],put],put->[putArtifact],finishPut->[regenerateAllIndexes],loadIndexes->[loadIndexes],ended->[finishPut]]]
Get the default index file.
This is left overs from a merge error.
@@ -47,13 +47,13 @@ public class S3DataSegmentFinder implements DataSegmentFinder { private static final Logger log = new Logger(S3DataSegmentFinder.class); - private final AmazonS3 s3Client; + private final EncryptingAmazonS3 s3Client; private final ObjectMapper jsonMapper; private final S3DataSegmentPusherConfig config; @Inject public S3DataSegmentFinder( - AmazonS3 s3Client, + EncryptingAmazonS3 s3Client, S3DataSegmentPusherConfig config, ObjectMapper jsonMapper )
[S3DataSegmentFinder->[findSegments->[getObjectMetadata,equals,getLoadSpec,info,getMaxListingLength,put,propagateIfInstanceOf,getObjectContent,getBucket,getTime,getKey,SegmentLoadingException,putObject,indexZipForSegmentPath,writeValueAsString,getIdentifier,ByteArrayInputStream,getBaseKey,toSet,hasNext,putInMapRetainingNewest,getObject,length,next,isObjectInBucketIgnoringPermission,objectSummaryIterator,propagate,getLastModified,get,collect,toUtf8,readValue],Logger]]
Produces a set of segments from the specified S3 path using the injected S3 client.
Instead of changing this everywhere, can you just bind `EncryptingAmazonS3` to `AmazonS3`? that way custom modules won't break
@@ -152,6 +152,12 @@ final class ApiLoader extends Loader 'collection' => $isCollection, ]; + $visiting = "$resourceClass$subresource"; + + if (in_array($visiting, $visited, true)) { + continue; + } + if (null === $parentOperation) { $rootResourceMetadata = $this->resourceMetadataFactory->create($rootResourceClass); $rootShortname = $rootResourceMetadata->getShortName();
[ApiLoader->[addRoute->[routeNameResolver],computeSubresourceOperations->[computeSubresourceOperations,routeNameResolver],loadExternalFiles->[load]]]
Computes the subresource operations, producing a route for each one.
Oops... The visited tracking is not specific enough. For example, if I have: Product -[isRelatedTo]-> Product and Product -[isSimilarTo]-> Product, it only works for the first of its type.
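The reviewer's counter-example can be made concrete: keying the visited set by resource class plus subresource class alone (as `"$resourceClass$subresource"` does) collapses two distinct relations between the same pair of classes. A Python sketch:

```python
relations = [
    ("Product", "isRelatedTo", "Product"),
    ("Product", "isSimilarTo", "Product"),
]

# Key ignores the property name, mirroring the diff above:
visited, routes = set(), []
for src, prop, dst in relations:
    key = src + dst
    if key in visited:
        continue  # isSimilarTo is wrongly skipped as "already visited"
    visited.add(key)
    routes.append(prop)
assert routes == ["isRelatedTo"]

# Including the property name in the key keeps both routes:
visited, routes = set(), []
for src, prop, dst in relations:
    key = (src, prop, dst)
    if key in visited:
        continue
    visited.add(key)
    routes.append(prop)
assert routes == ["isRelatedTo", "isSimilarTo"]
```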
@@ -65,6 +65,9 @@ public class Drools implements RulesEngine KnowledgeBaseConfiguration conf = KnowledgeBaseFactory.newKnowledgeBaseConfiguration(null, Thread.currentThread().getContextClassLoader()); + + conf.setOption(AssertBehaviorOption.EQUALITY); + if (rules.getConfiguration() != null) { conf.setOption((KnowledgeBaseOption) rules.getConfiguration());
[Drools->[retractFact->[fireAllRules,warn,getFactHandle,retract,getSession],createSession->[getMessage,isStateless,getResourceAsStream,newInputStreamResource,IOException,toString,getKnowledgePackages,newKnowledgeBuilderConfiguration,hasErrors,fireAllRules,WorkingMemorySLF4JLogger,getResource,newStatefulKnowledgeSession,newKnowledgeBaseConfiguration,isCepMode,createStaticMessage,getConfiguration,newKnowledgeBase,setGlobal,getContextClassLoader,DroolsSessionData,setOption,addKnowledgePackages,newKnowledgeBuilder,add,getClass,ConfigurationException],assertFact->[insert,update,fireAllRules,getFactHandle,getSession],disposeSession->[dispose,close],assertEvent->[size,insert,update,createStaticMessage,getFactHandle,fireAllRules,getWorkingMemoryEntryPoints,next,getWorkingMemoryEntryPoint,getSession,ConfigurationException],getLogger]]
Creates a session based on the given rules.
I guess that different modes exist for a reason... would be good to have a way to configure drools using the old mode (which could be better for performance reasons under some scenarios)
@@ -404,13 +404,16 @@ static ssize_t syscall_random(void *buf, size_t buflen) ERR_pop_to_mark(); if (p_getentropy.p != NULL) return p_getentropy.f(buf, buflen) == 0 ? (ssize_t)buflen : -1; -# endif +# endif +# endif /* !__DragonFly__ */ /* Linux supports this since version 3.17 */ # if defined(__linux) && defined(__NR_getrandom) return syscall(__NR_getrandom, buf, buflen, 0); # elif (defined(__FreeBSD__) || defined(__NetBSD__)) && defined(KERN_ARND) return sysctl_random(buf, buflen); +# elif (defined(__DragonFly__) && __DragonFly_version >= 500700) + return getrandom(buf, buflen, 0); # else errno = ENOSYS; return -1;
[No CFG could be retrieved]
Implements syscall_random(), now with a getrandom() path for DragonFly BSD 5.7 and later.
Is it worth conditioning this on the system call define as is done with `__NR_getrandom` above?
@@ -90,6 +90,8 @@ public abstract class AbstractMarkerBasedRollbackStrategy<T extends HoodieRecord String fileId = FSUtils.getFileIdFromFilePath(baseFilePathForAppend); String baseCommitTime = FSUtils.getCommitTime(baseFilePathForAppend.getName()); String partitionPath = FSUtils.getRelativePartitionPath(new Path(basePath), new Path(basePath, appendBaseFilePath).getParent()); + final Map<FileStatus, Long> writtenLogFileSizeMap = config.useFileListingMetadata() + ? getWrittenLogFileSizeMap(partitionPath, baseCommitTime, fileId) : Collections.EMPTY_MAP; HoodieLogFormat.Writer writer = null; try {
[AbstractMarkerBasedRollbackStrategy->[undoAppend->[generateHeader,Path,getParent,getFileIdFromFilePath,getFileStatus,getPartitionPath,getRelativePartitionPath,useFileListingMetadata,getTimestamp,close,exists,HoodieCommandBlock,HoodieIOException,getName,build,singletonMap,appendBlock,emptyMap,getCommitTime,getPath],undoCreate->[info,deleteBaseFile],deleteBaseFile->[Path,getParent,getRelativePartitionPath,build,delete],undoMerge->[info,deleteBaseFile],getLogger,getBasePath]]
This method is called when a rollback is performed; it precomputes the written log file sizes when file-listing metadata is enabled.
Hmmm. can we remove this flag as we discussed before? using the file listing flag here is very confusing and crosses a lot of layers. if you are worried about the new code breaking, then a catch block returning an empty map is good? Downside is it will make it hard to detect issues here.
@@ -10,6 +10,7 @@ from contextlib import closing, contextmanager from urllib.parse import urlparse from funcy import memoize, wrap_with, silent, first +import paramiko import dvc.prompt as prompt from dvc.progress import Tqdm
[RemoteSSH->[isfile->[isfile,ssh],remove->[remove,ssh],symlink->[symlink,ssh],batch_exists->[ssh],copy->[ssh],move->[move,ssh],ensure_credentials->[ask_password],_download->[ssh],exists->[exists,ssh],hardlink->[hardlink,getsize,ssh],list_cache_paths->[ssh],open->[ssh,open],reflink->[ssh,reflink],_load_user_ssh_config->[ssh_config_filename],ssh->[ensure_credentials],get_file_checksum->[ssh],cache_exists->[exists_with_progress->[batch_exists],ensure_credentials],_upload->[ssh],walk_files->[ssh,walk_files],getsize->[getsize,ssh],makedirs->[ssh,makedirs],isdir->[isdir,ssh]]]
Creates a new RemoteSSH object, reading its settings from the config.
Let's move this under `RemoteSSH.__init__` as a dynamic import.
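The deferred-import pattern the reviewer suggests — so merely importing the module doesn't pay for `paramiko` — looks roughly like this (class name is hypothetical, and a stdlib module stands in for paramiko so the sketch runs without the real dependency installed):

```python
class LazyRemote:
    """Sketch: import a heavy optional dependency inside __init__
    instead of at module top level."""

    def __init__(self):
        # Stand-in for `import paramiko`; json is used here only so the
        # example is self-contained and runnable.
        import json
        self._backend = json

    def dumps(self, obj):
        return self._backend.dumps(obj)

r = LazyRemote()
assert r.dumps({"a": 1}) == '{"a": 1}'
```

The import cost (and ImportError, if the dependency is missing) is deferred until the remote is actually instantiated.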
@@ -138,6 +138,10 @@ var getCdnVersion = function() { .reverse() .reduce(function(cdnVersion, version) { if (!cdnVersion) { + if (OFFLINE) { + // We are offline so just use the most recent version + cdnVersion = version; + } // Note: need to use shell.exec and curl here // as version-infos returns its result synchronously... var cdnResult = shell.exec('curl http://ajax.googleapis.com/ajax/libs/angularjs/' + version + '/angular.min.js ' +
[No CFG could be retrieved]
Get the latest version of angularjs that matches the given tag. Get the unstable snapshot version.
Shouldn't we return here?
@@ -0,0 +1,18 @@ + +module ConfigParams + extend ActiveSupport::Concern + def config_params + special_params_to_remove = %w[authentication_providers email_addresses meta_keywords credit_prices_in_cents] + special_params_to_add = %w[auth_providers_to_enable] + has_emails = params.dig(:site_config, :email_addresses).present? + params[:site_config][:email_addresses][:default] = ApplicationConfig["DEFAULT_EMAIL"] if has_emails + params&.require(:site_config)&.permit( + (SiteConfig.keys - special_params_to_remove + special_params_to_add).map(&:to_sym), + authentication_providers: [], + social_media_handles: SiteConfig.social_media_handles.keys, + email_addresses: SiteConfig.email_addresses.keys, + meta_keywords: SiteConfig.meta_keywords.keys, + credit_prices_in_cents: SiteConfig.credit_prices_in_cents.keys, + ) + end +end
[No CFG could be retrieved]
No Summary Found.
this can be promoted to constants I guess
@@ -3,7 +3,7 @@ class InlineInput < SimpleForm::Inputs::StringInput input_html_classes.push('col-10 field monospace') template.content_tag( :div, builder.text_field(attribute_name, input_html_options), - class: 'col col-12 sm-col-4 mb4 sm-mb0' + class: 'col col-12 sm-col-5 mb4 sm-mb0' ) end
[InlineInput->[input->[content_tag,text_field,push]]]
Renders the inline input tag.
It looks like this input is only used for the USPS confirmation code input.
@@ -39,9 +39,14 @@ import org.apache.cloudstack.engine.subsystem.api.storage.DataStore; import org.apache.cloudstack.engine.subsystem.api.storage.StorageStrategyFactory; import org.apache.cloudstack.engine.subsystem.api.storage.VolumeInfo; import org.apache.cloudstack.framework.async.AsyncCompletionCallback; +import org.apache.log4j.Logger; +import org.springframework.stereotype.Component; import com.cloud.agent.api.to.VirtualMachineTO; import com.cloud.host.Host; +import com.cloud.storage.Volume; +import com.cloud.storage.VolumeVO; +import com.cloud.storage.dao.VolumeDao; import com.cloud.utils.StringUtils; import com.cloud.utils.exception.CloudRuntimeException;
[DataMotionServiceImpl->[copyAsync->[getDataStore,getDataMotionStrategy,getName,canCopy,getUuid,CloudRuntimeException,join,add,copyAsync,cleanUpVolumesForFailedMigrations,name,keySet],cleanUpVolumesForFailedMigrations->[setState,setRemoved,update,findById,Date,getId],getLogger]]
Copies data asynchronously between data stores, cleaning up volumes for failed migrations.
unnecessary conflict potential
@@ -96,6 +96,7 @@ public class StateTransferLockImpl implements StateTransferLock { if (transactionDataTopologyId >= expectedTopologyId) { return CompletableFutures.completedNull(); } else { + // TODO Dan: Use thenComposeAsync to continue each command in a separate thread? return transactionDataFuture.thenCompose(nil -> transactionDataFuture(expectedTopologyId)); } }
[StateTransferLockImpl->[transactionDataFuture->[transactionDataFuture],topologyFuture->[topologyFuture]]]
Transaction data future.
No, the thread in which the execution should continue should be decided by the caller of `transactionDataFuture`.
@@ -120,7 +120,16 @@ describes.sandboxed('UrlReplacements', {}, () => { }, document: { nodeType: /* document */ 9, - querySelector: () => {return {href: canonical};}, + querySelector: selector => { + if (selector.startsWith('meta')) { + return { + getAttribute: () => {return 'https://whitelisted.com';}, + hasAttribute: () => {return true;}, + }; + } else { + return {href: canonical}; + } + }, cookie: '', }, Math: window.Math,
[No CFG could be retrieved]
Builds a fake window object used by the UrlReplacements tests.
Please add a space separated second origin.
@@ -74,6 +74,10 @@ class CarController(): # Send CAN commands. can_sends = [] + # FCW: trigger FCWAlert for 100 frames (4 seconds) + if hud_alert == VisualAlert.fcw: + self.fcw_frames = 100 + ### STEER ### if (frame % P.STEER_STEP) == 0:
[CarController->[update->[actuator_hystereses],__init__->[CarControllerParams]]]
Updates the car controller state and sends CAN bus commands.
This can be tweaked as needed. 4 seconds may be too long?
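The countdown the diff sets up can be sketched as follows, assuming the 25 Hz control rate implied by "100 frames (4 seconds)"; the class and helper names are illustrative, not from the real CarController:

```python
# Minimal sketch of the frame-countdown pattern used for the FCW alert.
# The 25 Hz control rate is an assumption inferred from "100 frames (4 seconds)".
CONTROL_RATE_HZ = 25

def fcw_frames_for(seconds, rate_hz=CONTROL_RATE_HZ):
    """Convert a desired alert duration into a frame count."""
    return int(seconds * rate_hz)

class AlertCountdown:
    def __init__(self):
        self.fcw_frames = 0

    def trigger(self, seconds=4.0):
        # Re-arm the countdown; it decays by one on every control frame.
        self.fcw_frames = fcw_frames_for(seconds)

    def update(self):
        """Called once per control frame; returns True while the alert is active."""
        active = self.fcw_frames > 0
        if active:
            self.fcw_frames -= 1
        return active
```

Tuning the duration then becomes a single argument change (`trigger(seconds=2.0)` for 50 frames) rather than a magic constant.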
@@ -14,7 +14,10 @@ namespace System.Security.Cryptography public abstract class MD5 : HashAlgorithm { - protected MD5() { } + protected MD5() + { + HashSizeValue = 128; + } public static new MD5 Create() => new Implementation();
[MD5->[Implementation->[HashFinal->[FinalizeHashAndReset],HashCore->[AppendHashData],TryHashFinal->[TryFinalizeHashAndReset],Dispose->[Dispose],CreateHashProvider,MD5,HashSizeInBytes]]]
Create a new MD5 implementation.
I wasn't sure what to do with `HashSizeValue = ...` in `Implementation` (line 33). I left it for now, but it's probably redundant. Perhaps `Implementation` should `Debug.Assert(_hashProvider.HashSizeInBytes * 8 == HashSizeValue)`?
@@ -34,6 +34,7 @@ class Options: "show_none_errors", "warn_no_return", "warn_return_any", + "warn_unused_ignores", "ignore_errors", "strict_boolean", "no_implicit_optional",
[Options->[select_options_affecting_cache->[getattr],__init__->[OrderedDict],__repr__->[pformat,'Options],clone_for_module->[module_matches_pattern,Options,get,update],module_matches_pattern->[match]]]
Initialize a new Options instance with default values.
Oh, now the docs also need to be updated (it has separate sections for global and per-module flags).
@@ -65,7 +65,9 @@ public class Whiteboard extends View { private boolean mInvertedColors; private boolean mMonochrome; private boolean mUndoModeActive = false; + private int foregroundColor; + File saveWhiteboardImagFile; public Whiteboard(AbstractFlashcardViewer cardViewer, boolean inverted, boolean monochrome) { super(cardViewer, null);
[Whiteboard->[trySecondFingerScroll->[updateSecondFinger],onDraw->[onDraw],drawAbort->[undo,drawFinish],createBitmap->[createBitmap,clear],UndoStack->[size->[size],pop->[size,pop],empty->[empty],add->[add],clear->[clear]],trySecondFingerClick->[updateSecondFinger],clear->[clear]]]
Creates a Whiteboard which allows the user to draw the card's answer on the touch screen, and saves the whiteboard as an image.
I think this was accidentally removed
@@ -614,7 +614,7 @@ class Probe $host .= ':'.$parts["port"]; } - if ($host == 'twitter.com') { + if ($host == 'twitter.com' || $host == 'mobile.twitter.com') { return self::twitter($uri); } $lrdd = self::hostMeta($host);
[Probe->[pumpioProfileData->[item,getBody,isSuccess,loadHTML],pollHcard->[query,loadHTML,getBody,isTimeout,item],getFeedLink->[query,getBody,isSuccess,loadHTML],ownHost->[getHostname],uri->[set,get],getHideStatus->[query,loadHTML,getInfo,getContentType,getBody,isSuccess],hostMeta->[getErrorNumber,get,getReturnCode,getBody,isSuccess,isTimeout],twitter->[query,getBody,isSuccess,loadHTML],feed->[getBody,isTimeout],webfinger->[getBody,isTimeout,get],pollZot->[getBody,isTimeout],pollNoscrape->[getBody,isTimeout],ostatus->[getBody,isTimeout]]]
Detects the network of a given profile URI. Parses the URI into a structure and extracts the network and base URL from the result.
Are you sure this function also works with `mobile.twitter.com` URLs?
@@ -613,6 +613,13 @@ func (pt *programTester) testPreviewUpdateAndEdits(dir string) error { // Perform an empty preview and update; nothing is expected to happen here. if !pt.opts.Quick { + + fprintf(pt.opts.Stdout, "Roundtripping checkpoint via stack export and stack import\n") + + if err := pt.exportImport(dir); err != nil { + return err + } + msg := "" if !pt.opts.AllowEmptyUpdateChanges { msg = "(no changes expected)"
[yarnCmd->[getYarnBin],testLifeCycleDestroy->[GetDebugUpdates,runPulumiCommand],testEdit->[previewAndUpdate],pulumiCmd->[getBin,GetDebugLogLevel],performExtraRuntimeValidation->[GetStackName,runPulumiCommand],copyTestToTemporaryDirectory->[GetStackName,getBin],previewAndUpdate->[GetDebugUpdates,runPulumiCommand],prepareGoProject->[getGoBin,runCommand],prepareProjectDir->[getProjinfo,prepareProject],runPulumiCommand->[pulumiCmd,runCommand],prepareNodeJSProject->[runYarnCommand],testLifeCycleInitialize->[GetStackName,runPulumiCommand],runYarnCommand->[yarnCmd,runCommand]]
testPreviewUpdateAndEdits performs a preview and update and then runs an edits on the.
Can we do this before every edit? If I'm understanding this right this will be importing and exporting a brand-new, empty stack every time.
@@ -31,8 +31,8 @@ type ServerConfig struct { VerificationMode TLSVerificationMode `config:"verification_mode"` // one of 'none', 'full' Versions []TLSVersion `config:"supported_protocols"` CipherSuites []tlsCipherSuite `config:"cipher_suites"` - CAs []string `config:"certificate_authorities"` - Certificate CertificateConfig `config:",inline"` + CAs []string `config:"certificate_authorities" validate:"required"` + Certificate CertificateConfig `config:",inline" validate:"required"` CurveTypes []tlsCurveType `config:"curve_types"` ClientAuth tlsClientAuth `config:"client_authentication"` //`none`, `optional` or `required` }
[Validate->[Validate],Unpack->[SetString,Unpack,HasField],IsEnabled,CurveID,ClientAuthType,Err]
LoadTLSServerConfig loads a TLS config from a ServerConfig.
I was thinking this would be conditionally required. If `client_auth` is `none` then you wouldn't need to set this.
@@ -202,6 +202,9 @@ def test_make_dics(): w = filters['weights'][0][:3] assert not np.allclose(np.diag(w.dot(w.T)), 1.0, rtol=1e-2, atol=0) + # Test whether spatial filter contains src_type + assert ('src_type' in filters.keys()) + @pytest.mark.slowtest @testing.requires_testing_data
[test_apply_dics_timeseries->[_load_forward,_simulate_data],test_make_dics->[_load_forward,_simulate_data,_test_weight_norm],test_tf_dics->[_load_forward,_simulate_data],test_apply_dics_csd->[_load_forward,_simulate_data]]
Tests for the DICS beamformer: constructing filters with make_dics and applying them to CSD data and time series.
`'src_type' in filters`
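The simplification suggested above relies on dict membership working without `.keys()`; a quick sketch (the `filters` dict here is a stand-in, not the real beamformer object):

```python
# Membership testing works directly on a dict, so `.keys()` is redundant.
filters = {"weights": [0.1, 0.2], "src_type": "surface"}

assert "src_type" in filters          # idiomatic form
assert "src_type" in filters.keys()   # equivalent, but needlessly verbose
assert "missing" not in filters
```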
@@ -122,7 +122,7 @@ namespace System.Windows.Forms { for (int i = 0; i < accObjectArray.Length; i++) { - accObjectArray[i] = AsNativeAccessible(accObjectArray[i]); + accObjectArray[i] = AsNativeAccessible(accObjectArray[i])!; } }
[InternalAccessibleObject->[GetItem->[GetItem,AsNativeAccessible],get_accSelection->[AsNativeAccessible],get_accFocus->[AsNativeAccessible],Toggle->[Toggle],ScrollIntoView->[ScrollIntoView],get_accKeyboardShortcut->[get_accKeyboardShortcut],GetPropertyValue->[GetPropertyValue],accDoDefaultAction->[accDoDefaultAction],get_accRole->[get_accRole],get_accState->[get_accState],get_accDescription->[get_accDescription],set_accName->[set_accName],GetMethods->[GetMethods],RemoveFromSelection->[RemoveFromSelection],accHitTest->[AsNativeAccessible,accHitTest],GetFields->[GetFields],GetColumnHeaders->[GetColumnHeaders,AsArrayOfNativeAccessibles],SetValue->[SetValue],SetFocus->[SetFocus],Navigate->[AsNativeAccessible,Navigate],GetMember->[GetMember],set_accValue->[set_accValue],GetProperty->[GetProperty],get_accName->[get_accName],GetRowHeaderItems->[GetRowHeaderItems,AsArrayOfNativeAccessibles],InvokeMember->[InvokeMember],accLocation->[accLocation],GetMembers->[GetMembers],GetMethod->[GetMethod],get_accHelpTopic->[get_accHelpTopic],AddToSelection->[AddToSelection],accSelect->[accSelect],GetField->[GetField],ElementProviderFromPoint->[ElementProviderFromPoint,AsNativeAccessible],Select->[Select],Collapse->[Collapse],accNavigate->[accNavigate,AsNativeAccessible],get_accParent->[AsNativeAccessible],AsArrayOfNativeAccessibles->[AsNativeAccessible],get_accHelp->[get_accHelp],Invoke->[Invoke],GetProperties->[GetProperties],GetPatternProvider->[GetPatternProvider],GetEmbeddedFragmentRoots->[GetEmbeddedFragmentRoots,AsArrayOfNativeAccessibles],get_accChild->[AsNativeAccessible,get_accChild],GetFocus->[AsNativeAccessible,GetFocus],Expand->[Expand],get_accDefaultAction->[get_accDefaultAction],GetRuntimeId->[GetRuntimeId],GetSelection->[GetSelection,AsArrayOfNativeAccessibles],GetColumnHeaderItems->[GetColumnHeaderItems,AsArrayOfNativeAccessibles],DoDefaultAction->[DoDefaultAction],get_accValue->[get_accValue],GetRowHeaders->[GetRowHeaders,AsArrayOfNativeAccessibles]]]
AsArrayOfNativeAccessibles - method to check if array of native accessibles.
I'd feel better if we created a new non-nullable AsNativeAccessible and avoid the bang here. Or even better convert the existing one and force the callers to do the null check when they need to.
@@ -19,7 +19,7 @@ type PipelineJobSpecErrorsController struct { // Destroy deletes a PipelineJobSpecError record from the database, effectively // silencing the error notification func (psec *PipelineJobSpecErrorsController) Destroy(c *gin.Context) { - jobSpec := job.Job{} + jobSpec := job.SpecError{} err := jobSpec.SetID(c.Param("ID")) if err != nil { jsonAPIError(c, http.StatusUnprocessableEntity, err)
[Destroy->[JobORM,DismissError,Background,New,Param,SetID,Is]]
Destroy - Deletes a pipeline job spec error.
Was this broken or just badly named?
@@ -75,7 +75,7 @@ class ConanRunner(object): # piping both stdout, stderr and then later only reading one will hang the process # if the other fills the pip. So piping stdout, and redirecting stderr to stdout, # so both are merged and use just a single get_stream_lines() call - proc = Popen(command, shell=True, stdout=PIPE, stderr=STDOUT, cwd=cwd) + proc = Popen(command, shell=isinstance(command, six.string_types), stdout=PIPE, stderr=STDOUT, cwd=cwd) except Exception as e: raise ConanException("Error while executing '%s'\n\t%s" % (command, str(e)))
[_UnbufferedWrite->[write->[write]],ConanRunner->[__call__->[_UnbufferedWrite,write],_pipe_os_call->[get_stream_lines->[write],get_stream_lines]]]
Helper function to pipe the output of a command to the process and return the return code.
Is it always true that for command args sequences, shell should be always ``False``? From the python docs: > This can be useful if you are using Python primarily for the enhanced control flow it offers over most system shells and still want convenient access to other shell features such as shell pipes, filename wildcards, environment variable expansion, and expansion of ~ to a user's home directory. Is it possible that users want to take advantage of shell features and provide sequences of args instead of strings?
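The pattern in the diff, enabling the shell only for string commands, can be sketched in Python 3 (dropping `six`, since `isinstance(command, str)` suffices there); whether callers ever want shell features with argument sequences remains the open question above:

```python
import subprocess

def run_command(command):
    """Run `command`, using the shell only when it is given as a string.

    A string like "echo hi | wc -c" needs shell=True for pipes and expansion;
    a list like ["echo", "hi"] is executed directly, avoiding quoting issues.
    This mirrors the `shell=isinstance(command, ...)` pattern in the diff;
    it is a sketch, not the actual ConanRunner implementation.
    """
    proc = subprocess.Popen(
        command,
        shell=isinstance(command, str),
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
    )
    out, _ = proc.communicate()
    return proc.returncode, out.decode()

# Both forms run the same program; only the invocation path differs.
rc_str, out_str = run_command("echo hello")        # goes through the shell
rc_list, out_list = run_command(["echo", "hello"])  # direct exec, no shell
```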
@@ -31,6 +31,8 @@ import <%= packageName %>.repository.search.UserSearchRepository; import <%= packageName %>.security.AuthoritiesConstants; <%_ if (authenticationType !== 'oauth2') { _%> import <%= packageName %>.service.MailService; +import org.springframework.data.domain.Sort; +import java.util.Collections; <%_ } _%> import <%= packageName %>.service.UserService; import <%= packageName %>.service.dto.<%= asDto('User') %>;
[No CFG could be retrieved]
Template imports for the generated user resource, conditional on packageName and authenticationType.
tests are because this import is misplaced
@@ -756,14 +756,14 @@ class Grouping < ApplicationRecord def access_repo group.access_repo do |repo| - add_starter_files(repo) + add_assignment_folder(repo) yield repo end end private - def add_starter_files(group_repo) + def add_assignment_folder(group_repo) assignment_folder = self.assignment.repository_folder # path may already exist if this is a peer review assignment. In that case do not create
[Grouping->[deletable_by?->[is_valid?],test_runs_instructors->[pluck_test_runs,group_hash_list,filter_test_runs],has_files_in_submission?->[has_submission?],due_date->[due_date],has_non_empty_submission?->[has_submission?],test_runs_students_simple->[filter_test_runs],remove_member->[membership_status],access_repo->[access_repo],remove_rejected->[membership_status],test_runs_students->[pluck_test_runs,group_hash_list,filter_test_runs],past_assessment_start_time?->[section],can_invite?->[pending?],membership_status->[membership_status],duration->[duration],test_runs_instructors_released->[pluck_test_runs,group_hash_list,filter_test_runs]]]
Accesses the group repository and adds the assignment folder if it does not already exist.
Please update this line to yield the repo if a block is given (this should allow you to call `grouping.access_repo` instead of `grouping.access_repo {}`).
@@ -73,6 +73,16 @@ class LanguageSettingsForm extends Form { $this->setData($name, array()); } } + + // Assign DEFAULT lang display modes: + if ($this->getData('localeDisplayTitle') == null) { + $this->setData('localeDisplayTitle', 'legacy'); + } + + if ($this->getData('localeDisplayFile') == null) { + $this->setData('localeDisplayFile', 'legacy'); + } + } /**
[LanguageSettingsForm->[initData->[getData,getSetting,getPrimaryLocale,setData],readInputData->[getData,readUserVars,setData],display->[getSupportedLocaleNames,assign],execute->[updateJournal,updateSetting,setPrimaryLocale,getData,setData,getId]]]
Initializes the data.
I don't think this extra consideration of defaults is necessary since you're using the "default:" case statement when determining behavior. It's better if we don't have to maintain code in the future that's specific to an upgrade between older versions; having to mine through history to figure out why code was relevant is painful after it's been forgotten.
@@ -87,13 +87,16 @@ def make_grid(tensor, nrow=8, padding=2, return grid -def save_image(tensor, filename, nrow=8, padding=2, - normalize=False, range=None, scale_each=False, pad_value=0): +def save_image(tensor, fp, nrow=8, padding=2, + normalize=False, range=None, scale_each=False, pad_value=0, format=None): """Save a given Tensor into an image file. Args: tensor (Tensor or list): Image to be saved. If given a mini-batch tensor, saves the tensor as a grid of images by calling ``make_grid``. + fp - A filename(string) or file object + format(Optional): If omitted, the format to use is determined from the filename extension. + If a file object was used instead of a filename, this parameter should always be used. **kwargs: Other arguments are documented in ``make_grid``. """ from PIL import Image
[save_image->[make_grid],make_grid->[norm_range->[norm_ip],norm_range]]
Save a given tensor into an image file.
If we add the `format` option, shouldn't we allow all additional parameters (`dpi` for `.png`, ...)for `Image.save()`? We could add a `**kwargs` or `save_kwargs` which is simply passed to `im.save()`.
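One way to read the suggestion above, sketched with a stand-in for `PIL.Image` so it runs standalone; the `save_kwargs` forwarding and `FakeImage` are illustrative assumptions, not the torchvision implementation:

```python
# Sketch of the reviewer's suggestion: forward extra keyword arguments
# (e.g. dpi for PNG) through to the underlying Image.save() call.
class FakeImage:
    """Stand-in for PIL.Image, recording what save() was called with."""
    def __init__(self):
        self.saved_with = None

    def save(self, fp, format=None, **params):
        # PIL's Image.save accepts format plus format-specific options.
        self.saved_with = (fp, format, params)

def save_image(im, fp, format=None, **save_kwargs):
    """Hypothetical wrapper: unknown keyword arguments pass through to im.save()."""
    im.save(fp, format=format, **save_kwargs)

im = FakeImage()
save_image(im, "out.png", format="PNG", dpi=(300, 300))
```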
@@ -47,6 +47,8 @@ def main(unused_argv=None): cmd.append('--logdir='+logdir) cmd.append('--service_addr='+FLAGS.service_addr) cmd.append('--duration_ms='+str(FLAGS.duration_ms)) + cmd.append('--num_tracing_attempts='+str(FLAGS.num_tracing_attempts)) + cmd.append('--include_dataset_ops='+str(FLAGS.include_dataset_ops).lower()) subprocess.call(cmd)
[run_main->[run],main->[call,append,expanduser,dirname,str,join,exit,expandvars],run_main,DEFINE_string,DEFINE_integer]
Main entry point for the command line tool.
The three flags are used here. I modified the tf.flags to absl.flags for all the 5 flags, while only 2 are newly added.
@@ -83,8 +83,10 @@ public final class AkkaHttpClientInstrumentation extends Instrumenter.Default { public static class SingleRequestAdvice { @Advice.OnMethodEnter(suppress = Throwable.class) - public static SpanWithScope methodEnter( - @Advice.Argument(value = 0, readOnly = false) HttpRequest request) { + public static void methodEnter( + @Advice.Argument(value = 0, readOnly = false) HttpRequest request, + @Advice.Local("otelSpan") Span span, + @Advice.Local("otelScope") Scope scope) { /* Versions 10.0 and 10.1 have slightly different structure that is hard to distinguish so here we cast 'wider net' and avoid instrumenting twice.
[AkkaHttpClientInstrumentation->[OnCompleteHandler->[apply->[onError,onResponse,get,end,beforeFinish,isSuccess]],typeMatcher->[named],SingleRequestAdvice->[methodEnter->[withSpan,inject,withScopedContext,incrementCallDepth,onRequest,getRequest,afterStart,current,AkkaHttpHeaders,startSpan,SpanWithScope],methodExit->[getSpan,OnCompleteHandler,reset,onComplete,onError,dispatcher,end,beforeFinish,closeScope]],transformers->[getName,takesArgument,named,and,put],AkkaHttpHeaders->[set->[create,addHeader]],helperClassNames->[getName]]]
Method enter.
use `callDepth = TRACER.getCallDepth();` pattern here
@@ -356,8 +356,9 @@ class BinaryInstaller(object): raise ConanException(textwrap.dedent('''\ Missing prebuilt package for '%s' - Try to build from sources with "%s" - Or read "http://docs.conan.io/en/latest/faq/troubleshooting.html#error-missing-prebuilt-package" + Try to build from sources with '%s' + Use 'conan search <reference> --table table.html' and filter using your profile + Or read 'http://docs.conan.io/en/latest/faq/troubleshooting.html#error-missing-prebuilt-package' ''' % (missing_pkgs, build_str))) def _download(self, downloads, processed_package_refs):
[_PackageBuilder->[build_package->[_package,_build,_get_build_folder,_prepare_sources],_get_build_folder->[build_id]],build_id->[build_id],BinaryInstaller->[install->[_build],_propagate_info->[add_env_conaninfo],_download->[_download],_build->[_handle_system_requirements,_download,_raise_missing,_classify],_handle_node_cache->[_download_pkg],_build_package->[_PackageBuilder,build_package]]]
Raise an exception for a missing prebuilt binary. Download binary packages only once for a given PREF.
And something like _"Use 'conan search ....' to check the packages that are available"_... If it works, and you filter using your profile, you will get no results
@@ -1286,9 +1286,9 @@ cont_op_hdlr(struct cmd_args_s *ap) } if (ap->pool_label) { - rc = daos_pool_connect_by_label(ap->pool_label, ap->sysname, - DAOS_PC_RW, &ap->pool, - NULL, NULL); + rc = daos_pool_connect(ap->pool_label, ap->sysname, + DAOS_PC_RW, &ap->pool, + NULL, NULL); if (rc != 0) { fprintf(stderr, "failed to connect to " "pool %s: %s (%d)\n",
[sh_op->[strcmp],int->[cont_set_attr_hdlr,cont_get_acl_hdlr,args_free,uuid_generate,cont_del_attr_hdlr,strtol,ARGS_VERIFY_PATH_NON_CREATE,cont_clone_hdlr,d_errdesc,daos_str2csumcontprop,cont_op_parse,daos_ec_cs_valid,D_STRNDUP_S,strcpy,pool_del_attr_hdlr,daos_cont_open_by_label,cont_create_uns_hdlr,cont_destroy_snap_hdlr,uuid_parse,daos_obj_id_parse,fs_copy_hdlr,pool_list_containers_hdlr,strnlen,assert,ARGS_VERIFY_OID,cont_delete_acl_hdlr,fs_dfs_hdlr,ARGS_VERIFY_PUUID,cont_list_attrs_hdlr,pool_query_hdlr,print_oclass_names_list,parse_filename_dfs,DAOS_PROP_CO_STATUS_VAL,duns_resolve_path,DP_UUID,strchr,uuid_copy,ALL_BUT_CONT_CREATE_OPTS_HELP,shell_op_parse,close,pool_op_parse,FS_COPY_CMDS_HELP,duns_destroy_attr,cont_create_snap_hdlr,daos_cont_open,D_GOTO,tobytes,cont_create_hdlr,daos_str2compresscontprop,cont_get_prop_hdlr,fprintf,cont_get_attr_hdlr,pool_get_attr_hdlr,strlen,cont_overwrite_acl_hdlr,D_FREE,daos_parse_ctype,cont_list_snaps_hdlr,ioctl,epoch_range_parse,D_STRNDUP,pool_autotest_hdlr,obj_query_hdlr,daos_parse_properties,cont_query_hdlr,obj_op_parse,uuid_is_null,cmd_args_print,strerror,FIRST_LEVEL_HELP,cont_update_acl_hdlr,obj_ctl_shell,getopt_long,daos_pool_connect,daos_cont_close,daos_str2encryptcontprop,cont_list_objs_hdlr,call_dfuse_ioctl,ARGS_VERIFY_CUUID,cont_rollback_hdlr,cont_check_hdlr,strtoull,daos_pool_connect_by_label,daos_oclass_name2id,daos_prop_alloc,pool_list_attrs_hdlr,sscanf,pool_get_prop_hdlr,cont_destroy_hdlr,D_ASPRINTF,strcmp,daos_parse_cmode,daos_parse_property,open,cont_set_owner_hdlr,cont_set_prop_hdlr,daos_pool_disconnect,pool_set_attr_hdlr,ALL_CONT_CMDS_HELP,filesystem_op_parse],cont_op->[strcmp],fs_op->[strcmp],inline->[strcasecmp],main->[daos_init,args_free,common_op_parse_hdlr,daos_fini,d_errdesc,strcmp,hdlr,printf,fprintf,help_hdlr],daos_size_t->[fprintf,strtoull],obj_op->[strcmp],void->[daos_oclass_id2name,free,D_FREE,DP_OID,daos_oclass_names_list,malloc,DP_UUID,daos_prop_free,daos_unparse_ctype,D_INFO,fprintf],pool_op->[strcmp]]
Handles container operations: connects to the pool by label or UUID, opens the necessary container, and closes the container and disconnects from the pool in both normal and error flows.
with new 100 column limit (up from 80) coding convention we may be able to condense this to just one line. Would suggest only if there are other changes needed in this patch though.
@@ -508,8 +508,7 @@ func printPropertyValueDiff( if shouldPrintOld && shouldPrintNew { if diff.Old.IsArchive() && - diff.New.IsArchive() && - !causedReplace { + diff.New.IsArchive() { printArchiveDiff( b, titleFunc, diff.Old.ArchiveValue(), diff.New.ArchiveValue(),
[IsObject,IsAssets,ArrayValue,NewArchiveProperty,DiffLinesToChars,Assertf,BoolValue,DiffMain,IsBool,PropertyKey,IsNull,Diff,IsOutput,Strings,Color,Itoa,NewAssetProperty,IgnoreError,New,GetAssets,ParseReference,StableKeys,Assert,Len,NumberValue,Prefix,IsComputed,TypeString,TrimSpace,ID,Failf,TypeOf,IsAsset,IsText,AssetValue,GetPath,DiffCharsToLines,Split,IsString,StringValue,IsURI,IsArchive,GetURI,Sprintf,ObjectValue,IsNumber,ArchiveValue,IsArray,RawPrefix,Keys,String,WriteString,URN,MassageIfUserProgramCodeAsset]
elemTitleFunc prints the i-th element of a branch. Checks if the object has a primitive value.
note: this !causedReplace line was something that was here since some of the earliest days of pulumi. I never touched it because i wasn't certain what impact it might have. Now that i've run into this issue first hand, i def think this is something we do *not* want, and we should essentially always print two archives as a diff, regardless of what the cause was.
@@ -685,7 +685,7 @@ export class ResourcesImpl { this.prerenderSize_ = this.viewer_.getPrerenderSize(); const firstPassAfterDocumentReady = - this.documentReady_ && this.firstPassAfterDocumentReady_; + this.documentReady_ && this.firstPassAfterDocumentReady_ && this.ampInitialized_; if (firstPassAfterDocumentReady) { this.firstPassAfterDocumentReady_ = false; const doc = this.win.document;
[No CFG could be retrieved]
Initializes the AMP document and schedules a pass. This method is called when a page view is not visible.
I suggested a temporary, lower-risk fix that sets `this.maybeChangeHeight_ = true` when `ampInitialized_` gets set to `true`. Could we do that just before the cut this week (and then go with this fix after)? This way we can avoid adding a behavior change on top of ~4 weeks of AMP changes to the next push.
@@ -1471,7 +1471,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator): if len(ssl_vhost.name) < (255 - (len(redirect_filename) + 1)): redirect_filename = "le-redirect-%s.conf" % ssl_vhost.name - redirect_filepath = os.path.join(self.conf("vhost-root"), + redirect_filepath = os.path.join(self.vhostroot, redirect_filename) # Register the new file that will be created
[ApacheConfigurator->[_add_servername_alias->[_add_servernames],enhance->[choose_vhost],perform->[restart,perform],make_addrs_sni_ready->[is_name_vhost,add_name_vhost],_copy_create_ssl_vhost_skeleton->[_sift_rewrite_rule],get_virtual_hosts->[_create_vhost],_add_name_vhost_if_necessary->[is_name_vhost,add_name_vhost],cleanup->[restart],_create_vhost->[_add_servernames],make_vhost_ssl->[_create_vhost],_enable_ocsp_stapling->[_escape],_find_best_vhost->[included_in_wildcard],enable_site->[is_site_enabled],_create_redirect_vhost->[_escape,_create_vhost],_has_matching_wildcard->[included_in_wildcard]]]
Write out a redirect file.
Someday I think it'd be good to be consistent here and use the same logic as we do for making SSL vhosts (use the vhostroot if it's set by the user, otherwise, use the SSL vhost path), but that doesn't (and probably shouldn't) go into this PR.
@@ -1173,7 +1173,7 @@ module.exports = class upbit extends Exchange { let feeCost = this.safeFloat (order, 'paid_fee'); let feeCurrency = undefined; const marketId = this.safeString (order, 'market'); - market = this.safeValue (this.markets_by_id, marketId); + market = this.safeMarket (marketId, market); let symbol = undefined; if (market !== undefined) { symbol = market['symbol'];
[No CFG could be retrieved]
Parses an order structure returned by the exchange. getCostFromTrades derives the order cost from its trades.
The lines below this one look deprecated.
@@ -75,7 +75,7 @@ namespace Microsoft.Xna.Framework #endif } - static internal string Location { get; private set; } + public static string Location { get; private set; } #if IOS static internal bool SupportRetina { get; private set; } #endif
[TitleContainer->[Stream->[Exists,IsPathRooted,Result,OpenRead,GetDirectoryName,GetFileNameWithoutExtension,Open,Combine,GetFilename,GetExtension],OpenStreamAsync->[OpenReadAsync,GetFileAsync,AsStreamForRead,Current],GetFilename->[DirectorySeparatorChar,Replace],Path,Scale,Empty,ResourcePath,BaseDirectory]]
TitleContainer opens a stream for the given file.
This should probably stay internal.
@@ -62,7 +62,12 @@ class MySales extends Component { render() { const { filter, loading, purchases } = this.state - const completedStates = ['withdrawn', 'finalized', 'sellerReviewed'] + const completedStates = [ + 'withdrawn', + 'finalized', + 'sellerReviewed', + 'ruling' + ] const filteredPurchases = purchases.filter(({ offer }) => { const completed = completedStates.includes(offer.status)
[No CFG could be retrieved]
Renders the MySales component, filtering purchases by their completed states.
Since this `completedStates` array is repeated a few times - maybe make it a util?
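A sketch of the suggested util, written in Python for illustration (the real component is JavaScript, and all names here are made up):

```python
# Hoist the repeated list of completed offer states into one shared constant.
COMPLETED_STATES = frozenset({"withdrawn", "finalized", "sellerReviewed", "ruling"})

def is_completed(offer):
    """True when the offer has reached a terminal state."""
    return offer["status"] in COMPLETED_STATES

purchases = [
    {"offer": {"status": "finalized"}},
    {"offer": {"status": "pending"}},
    {"offer": {"status": "ruling"}},
]
completed = [p for p in purchases if is_completed(p["offer"])]
```

With the constant in one module, adding a new terminal state (as the diff does with `'ruling'`) becomes a single edit instead of several.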
@@ -101,10 +101,9 @@ namespace System.Collections // which allows to avoid the long multiplication if the divisor is less than 2**31. Debug.Assert(divisor <= int.MaxValue); - ulong lowbits = multiplier * value; - // 64bit * 64bit => 128bit isn't currently supported by Math https://github.com/dotnet/runtime/issues/31184 - // otherwise we'd want this to be (uint)Math.BigMul(lowbits, divisor, out _) - uint highbits = (uint)((((lowbits >> 32) + 1) * divisor) >> 32); + // This is equivalent of (uint)Math.BigMul(multiplier * value, divisor, out _). This version + // is faster than BigMul currently because we only need the high bits. + uint highbits = (uint)(((((multiplier * value) >> 32) + 1) * divisor) >> 32); Debug.Assert(highbits == value % divisor); return highbits;
[HashHelpers->[ExpandPrime->[GetPrime],GetPrime->[IsPrime]]]
FastMod implementation computing value % divisor using a precomputed multiplier.
This could use `(uint)Bmi2.X64.MultiplyNoFlags(multiplier * value, divisor)` directly as that overload is fast (has no memory access overhead).
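The FastMod expression from the diff can be transcribed to Python with explicit 64-bit wraparound, which makes the equivalence to `value % divisor` easy to check directly; the multiplier formula mirrors the `ulong.MaxValue / divisor + 1` value the runtime precomputes (an assumption about the caller's setup):

```python
# Python sketch of the FastMod trick, emulating C#'s wrapping ulong arithmetic.
MASK64 = (1 << 64) - 1

def get_fast_mod_multiplier(divisor):
    # Assumed to match the precomputed multiplier: ulong.MaxValue / divisor + 1.
    return (MASK64 // divisor) + 1

def fast_mod(value, divisor, multiplier):
    """Equivalent of the C# one-liner, valid for divisor <= 2**31 - 1."""
    lowbits = (multiplier * value) & MASK64       # 64-bit wrapping multiply
    return ((((lowbits >> 32) + 1) * divisor) >> 32) & 0xFFFFFFFF

# Spot-check the Debug.Assert from the diff: highbits == value % divisor.
for d in (3, 7, 1024, 1000003):
    m = get_fast_mod_multiplier(d)
    for v in (0, 1, 2, 12345, 2**31, 2**32 - 1):
        assert fast_mod(v, d, m) == v % d
```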
@@ -367,6 +367,10 @@ func Collaboration(ctx *context.Context) { // CollaborationPost response for actions for a collaboration of a repository func CollaborationPost(ctx *context.Context) { name := strings.ToLower(ctx.Query("collaborator")) + // name may be formatted as "username (fullname)" + if strings.Contains(name, "(") && strings.HasSuffix(name, ")") { + name = strings.TrimSpace(strings.Split(name, "(")[0]) + } if len(name) == 0 || ctx.Repo.Owner.LowerName == name { ctx.Redirect(setting.AppSubURL + ctx.Req.URL.Path) return
[Handle,Now,ChangeCollaborationAccessMode,IsUserExist,GetRepositoryByName,Redirect,Info,HasPrefix,Add,NewRepoRedirect,RenderWithErr,TransferOwnership,IsErrRepoNotExist,HTML,Error,Link,CleanUpMigrateInfo,Hooks,IsErrSSHDisabled,IsErrNameReserved,ChangeRepositoryName,Trace,HasError,IsOrganization,QueryInt64,DeleteCollaboration,GetCollaborators,JSON,Tr,IsErrKeyUnableVerify,DeleteDeployKey,IsErrNamePatternNotAllowed,UpdateRepositoryUnits,IsOwnedBy,CheckPublicKeyString,ToLower,IsErrUserNotExist,GetHook,UpdateRepository,IsOwner,GetUserByName,DashboardLink,IsOrgMember,DeleteRepository,RenameRepoAction,Query,UpdateMirror,AccessMode,Update,ListDeployKeys,DeleteMirrorByRepoID,DeleteWiki,SendCollaboratorMail,AddDeployKey,IsErrKeyAlreadyExist,AddCollaborator,IsErrKeyNameAlreadyUsed,SaveAddress,IsErrRepoAlreadyExist,QueryInt,Params,ParseDuration,Success]
Collaboration renders a page for managing collaborators of a repository. CollaborationPost adds a collaborator to the repository.
nit: can simplify to
```go
if index := strings.Index(name, " ("); index >= 0 {
	name = name[:index]
}
```
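The same cut-at-first-separator idea reads naturally in Python too (a sketch; `strip_fullname` is a hypothetical helper, not part of the Gitea codebase):

```python
def strip_fullname(name):
    """Reduce "username (Full Name)" to "username".

    Mirrors the suggested Go `strings.Index(name, " (")` approach: cut at
    the first " (" instead of checking both Contains("(") and HasSuffix(")").
    """
    idx = name.find(" (")
    return name[:idx] if idx >= 0 else name
```

Note this also leaves names containing a bare `(` untouched, since it only matches the ` (` separator.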
@@ -80,10 +80,13 @@ namespace System.Net.Http.Functional.Tests { // These protocols are all enabled by default, so we can connect with them both when // explicitly specifying it in the client and when not. - foreach (SslProtocols protocol in new[] { SslProtocols.Tls, SslProtocols.Tls11, SslProtocols.Tls12 }) + foreach (SslProtocols protocol in Enum.GetValues(typeof(SslProtocols))) { - yield return new object[] { protocol, false }; - yield return new object[] { protocol, true }; + if (protocol != 0 && (protocol & SslProtocolSupport.SupportedSslProtocols) == protocol) + { + yield return new object[] { protocol, false }; + yield return new object[] { protocol, true }; + } } // These protocols are disabled by default, so we can only connect with them explicitly.
[HttpClientHandler_SslProtocols_Test->[DefaultProtocols_MatchesExpected->[None,SslProtocols,CreateHttpClientHandler,Equal],SupportedSSLVersionServers->[Tls,SupportsSsl3,TLSv10RemoteServer,Tls11,TLSv12RemoteServer,TLSv11RemoteServer,SSLv3RemoteServer,Tls12,Ssl3],GetAsync_AllowedSSLVersion_Succeeds_MemberData->[Tls,SupportsSsl3,Tls11,IsWindows,Tls12,IsUbuntu1810OrHigher,Ssl3,IsWindows10Version1607OrGreater,Ssl2,Tls13],SetGetProtocols_Roundtrips->[Tls,Equal,None,Tls11,SslProtocols,CreateHttpClientHandler,Tls12,Tls13],Task->[Tls,Tls12,ReadRequestHeaderAndSendResponseAsync,Tls13,SslProtocols,Assert,AcceptConnectionSendResponseAndCloseAsync,CreateHttpClient,Equal,nameof,Dispose,GetAsync,AllowAllCertificates,WhenAllCompletedOrAnyFailed,AcceptConnectionAsync,SslProtocol,Tls11,CreateHttpClientHandler,Ssl3,ServerCertificateCustomValidationCallback,CreateServerAsync,Run,Ssl2,True],NotSupportedSSLVersionServers->[SupportsSsl2,Ssl2,SSLv2RemoteServer],GetType,Linux,IsOSPlatform,Contains,Equals]]
Returns an enumerable of objects that describe whether the current SSL version is allowed by the client.
0 == None, right? Shouldn't we use None here instead of 0?
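The filtering logic the reviewer is discussing reads more clearly with a named zero member; here is a Python `enum.Flag` sketch where `Proto` is a stand-in for `SslProtocols` (members and the supported set are illustrative):

```python
from enum import Flag, auto

# Illustrative stand-in for SslProtocols; the reviewer's point is that the
# literal 0 is the flag's None member, which reads better by name.
class Proto(Flag):
    NONE = 0
    TLS10 = auto()
    TLS11 = auto()
    TLS12 = auto()

SUPPORTED = Proto.TLS11 | Proto.TLS12

candidates = [Proto.NONE, Proto.TLS10, Proto.TLS11, Proto.TLS12]
# Keep only non-empty flags fully contained in the supported set,
# matching `protocol != 0 && (protocol & supported) == protocol`.
enabled = [p for p in candidates if p != Proto.NONE and (p & SUPPORTED) == p]
```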
@@ -75,12 +75,17 @@ public class HiveOrcSerDeManager extends HiveSerDeManager { public static final String DEFAULT_SERDE_TYPE = "ORC"; public static final String INPUT_FORMAT_CLASS_KEY = "hiveOrcSerdeManager.inputFormatClass"; public static final String DEFAULT_INPUT_FORMAT_CLASS = OrcInputFormat.class.getName(); + public static final String WRITER_LATEST_SCHEMA = "writer.latest.schema"; public static final String OUTPUT_FORMAT_CLASS_KEY = "hiveOrcSerdeManager.outputFormatClass"; public static final String DEFAULT_OUTPUT_FORMAT_CLASS = OrcOutputFormat.class.getName(); public static final String HIVE_SPEC_SCHEMA_READING_TIMER = "hiveOrcSerdeManager.schemaReadTimer"; + public static final String HIVE_SPEC_SCHEMA_FROM_WRITER = "hiveOrcSerdeManager.getSchemaFromWriterSchema"; + public static final boolean DEFAULT_HIVE_SPEC_SCHEMA_FROM_WRITER = false; + + public static final String ENABLED_ORC_TYPE_CHECK = "hiveOrcSerdeManager.enableFormatCheck"; public static final boolean DEFAULT_ENABLED_ORC_TYPE_CHECK = false;
[HiveOrcSerDeManager->[getSchemaFromLatestFile->[getSchemaFromLatestFile],addSchemaPropertiesHelper->[getSchemaFromLatestFile]]]
The HiveOrcSerDeManager manages Hive SerDe properties for ORC data; this change adds configuration keys for reading the schema from the writer.
Do we need to introduce a config? Can we always look for writer schema first and then default to file schema if it doesn't exist?
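The reviewer's suggestion, always prefer the writer-recorded schema and fall back to reading the file only when it is absent, can be sketched in Python (property name taken from the patch; the function and reader callback are hypothetical):

```python
def resolve_schema(table_props, read_schema_from_file):
    """Prefer the schema the writer recorded; otherwise read it from the file."""
    writer_schema = table_props.get("writer.latest.schema")
    if writer_schema is not None:
        return writer_schema
    return read_schema_from_file()
```

With this lookup order no extra config flag is needed: the preference is encoded by the fallback itself.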
@@ -71,7 +71,7 @@ func ImageCache() *ICache { // Update runs only once at startup to hydrate the image cache func (ic *ICache) Update(client *client.PortLayer) error { - log.Debugf("Updating image cache...") + log.Debugf("Updating image cache") host, err := sys.UUID() if host == "" {
[GetImage->[ParseIDOrReference,NewRequestNotFoundError,RLock,Errorf,Get,RUnlock,getImageByNamed,getImageByDigest],AddImage->[Unlock,Error,WithTag,Sprintf,WithName,Name,Lock,Errorf,Tag,HasPrefix,Add],GetImages->[RUnlock,RLock],getImageByNamed->[Tag,Name],getImageByDigest->[NewRequestNotFoundError,Errorf],Update->[NewErrorWithStatusCode,CreateImageStore,NewCreateImageStoreParamsWithContext,ListImages,WithBody,UUID,Unmarshal,AddImage,Hostname,Errorf,NewListImagesParamsWithContext,WithStoreName,Debugf],NewTruncIndex,TODO]
Update hydrates the image cache with data from the portlayer.
Minor and feel free to ignore. This could probably be made an `Infof` message. Up to you.
@@ -129,6 +129,15 @@ def is_secret_known( return secrethash in end_state.secrethashes_to_unlockedlocks +def get_secret( + end_state: NettingChannelEndState, + secrethash: typing.SecretHash, +) -> typing.Secret: + """Returns `secret` if the `secrethash` is for a lock with a known secret.""" + if is_secret_known(end_state, secrethash): + return end_state.secrethashes_to_unlockedlocks[secrethash].secret + + def is_transaction_confirmed( transaction_block_number: typing.BlockNumber, blockchain_block_number: typing.BlockNumber,
[register_secret_endstate->[is_lock_locked],handle_block->[is_deposit_confirmed,get_status],send_refundtransfer->[create_sendlockedtransfer],handle_channel_unlock->[_del_lock,compute_proof_for_lock,is_lock_pending,get_lock,register_secret,get_status],handle_send_directtransfer->[get_status,send_directtransfer,get_distributable],send_directtransfer->[create_senddirecttransfer],get_distributable->[get_amount_locked,get_balance],create_sendlockedtransfer->[get_next_nonce,compute_merkletree_with,get_amount_locked,get_distributable,get_status],handle_channel_closed->[get_known_unlocks,get_status,set_closed],handle_channel_newbalance->[is_transaction_confirmed],create_senddirecttransfer->[get_next_nonce,get_status,get_amount_locked,get_distributable],send_unlock->[create_unlock,get_lock,_del_lock],handle_receive_refundtransfercancelroute->[handle_receive_lockedtransfer],is_valid_lockedtransfer->[is_valid_signature],create_unlock->[get_next_nonce,get_amount_locked,compute_merkletree_without,is_lock_pending,get_status],send_lockedtransfer->[create_sendlockedtransfer],is_valid_directtransfer->[is_valid_signature],handle_channel_settled->[set_settled],handle_receive_directtransfer->[get_current_balanceproof,is_valid_directtransfer],events_for_close->[get_status],_del_lock->[is_lock_pending],state_transition->[handle_action_close,handle_block,handle_channel_closed,handle_channel_unlock,handle_channel_newbalance,handle_receive_directtransfer,handle_send_directtransfer,handle_channel_settled],handle_action_close->[events_for_close],handle_unlock->[_del_lock,is_valid_unlock],register_secret->[register_secret_endstate],is_valid_unlock->[is_valid_signature],handle_receive_lockedtransfer->[is_valid_lockedtransfer]]
Checks if a transaction is confirmed.
please add the explicit `return None`
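The requested change, spelling out the fall-through case with an explicit `return None`, looks like this in a simplified sketch of `get_secret` (the real Raiden state types are replaced here with a plain dict and a namedtuple):

```python
from collections import namedtuple

UnlockedLock = namedtuple("UnlockedLock", "secret")

def get_secret(secrethashes_to_unlockedlocks, secrethash):
    """Return the secret for `secrethash`, or None when it is not known."""
    lock = secrethashes_to_unlockedlocks.get(secrethash)
    if lock is not None:
        return lock.secret
    return None  # explicit, so the "unknown secret" branch is visible
```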
@@ -388,6 +388,10 @@ def plot_bem(subject=None, subjects_dir=None, orientation='coronal', ------- fig : Instance of matplotlib.figure.Figure The figure. + + See Also + -------- + :func:`mne.viz.plot_trans` """ subjects_dir = get_subjects_dir(subjects_dir, raise_error=True)
[plot_ideal_filter->[_get_flim,_check_fscale,adjust_axes,_filter_ticks],plot_filter->[_get_flim,_check_fscale,adjust_axes,_filter_ticks],plot_bem->[_plot_mri_contours]]
Plot BEM contours on anatomical slices.
same about func and backticks
@@ -177,7 +177,7 @@ module Engine end def can_buy_any_from_ipo?(entity) - @game.corporations.any? { |c| c.ipoed && can_buy?(entity, c.shares.first&.to_bundle) } + @game.corporations.any? { |c| c.ipoed && c.shares.any? { |s| can_buy?(entity, s&.to_bundle) } } end def can_buy_any?(entity)
[BuySellParShares->[sell_shares->[can_sell?],can_ipo_any?->[can_buy?],can_buy_any_from_market?->[can_buy?],can_sell_any?->[can_sell?],can_buy_any_from_ipo?->[can_buy?],can_buy_any?->[can_buy_any_from_ipo?,can_buy_any_from_market?],pass!->[pass!],purchasable_companies->[purchasable_companies]]]
Checks whether the entity can buy any share from the IPO or the market.
why would s be nil?
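The fix widens the search from the first share to every share of each corporation; a minimal Python analogue of the Ruby change (the data shapes are invented for illustration):

```python
def can_buy_any_from_ipo(corporations, can_buy):
    """True when any share of any IPO'd corporation is affordable.

    Checking only the first share misses corporations whose first
    (e.g. president's) share is too expensive but whose later,
    cheaper shares are not.
    """
    return any(
        corp["ipoed"] and any(can_buy(share) for share in corp["shares"])
        for corp in corporations
    )
```

With shares priced `[100, 10]` and a budget of 20, a first-share-only check would report nothing buyable even though the 10 share is affordable.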
@@ -251,10 +251,11 @@ namespace System.Text.Json.Serialization.Tests } [Fact] - [ActiveIssue("JsonElement needs to support Path")] public async Task ExtensionPropertyRoundTripFails() { - JsonException e = await Assert.ThrowsAsync<JsonException>(() => Serializer.DeserializeWrapper<Parameterized_ClassWithExtensionProperty>(@"{""MyNestedClass"":{""UnknownProperty"":bad}}")); + JsonException e = await Assert.ThrowsAsync<JsonException>(() => + Serializer.DeserializeWrapper<Parameterized_ClassWithExtensionProperty>(@"{""MyNestedClass"":{""UnknownProperty"":bad}}")); + Assert.Equal("$.MyNestedClass.UnknownProperty", e.Path); }
[No CFG could be retrieved]
Verifies that deserializing malformed JSON into a class with an extension property throws with the expected JSON path.
This was previously fixed -- the issue wasn't with JsonElement but in logic setting the current property.
@@ -381,8 +381,7 @@ def install_unpacked_wheel( continue elif ( is_base and - s.endswith('.dist-info') and - canonicalize_name(s).startswith(canonicalize_name(name)) + s.endswith('.dist-info') ): assert not info_dir, ( 'Multiple .dist-info directories: {}, '.format(
[_raise_for_invalid_entrypoint->[MissingCallableSuffix],install_unpacked_wheel->[record_installed->[normpath],clobber->[record_installed],open_for_csv,root_is_purelib,message_about_scripts_not_on_PATH,sorted_outrows,clobber,get_entrypoints,PipScriptMaker,get_csv_rows_for_installed],PipScriptMaker->[make->[_raise_for_invalid_entrypoint]],get_entrypoints->[_split_ep],install_wheel->[install_unpacked_wheel],get_csv_rows_for_installed->[normpath,rehash]]
Install an unpacked wheel, mapping archive RECORD paths to installation RECORD paths.
On the other hand, it would now mean that a wheel can contain a single unrelated dist-info dir.
@@ -190,7 +190,7 @@ public class GameMap extends GameDataComponent implements Iterable<Territory> { private Set<Territory> getNeighbors(final Set<Territory> frontier, final Set<Territory> searched, final int distance, @Nullable final Predicate<Territory> cond) { - if (distance == 0) { + if (distance == 0 || frontier.isEmpty()) { return searched; } final Set<Territory> newFrontier = frontier.stream()
[GameMap->[getDistance->[getDistance],getWaterDistance->[getDistance],getRoute_IgnoreEnd->[getRoute],getDistance_IgnoreEndForCondition->[getDistance],getNeighbors->[getNeighbors],getNeighborsValidatingCanals->[getNeighbors],getRoute->[getRoute],getLandDistance->[getDistance],iterator->[iterator]]]
Recursive method to find neighbors in a given set.
Add empty check here as once the frontier is empty, you won't ever find any more territories to search.
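The early exit the review asks for can be seen in a small breadth-first sketch of the neighbor search (the graph representation here is hypothetical, not the TripleA API):

```python
def neighbors_within(start, adjacency, distance):
    """Collect every node reachable from `start` in at most `distance` hops."""
    searched = {start}
    frontier = {start}
    while distance > 0 and frontier:  # stop as soon as the frontier drains
        frontier = {
            nbr for node in frontier for nbr in adjacency.get(node, ())
        } - searched
        searched |= frontier
        distance -= 1
    return searched
```

On a small connected component, a large `distance` no longer forces useless iterations once `frontier` is empty.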
@@ -83,6 +83,15 @@ namespace Dynamo.Core.Threading if (!nodeModel.IsUpdated && (!nodeModel.RequiresRecalc)) return false; // Not has not been updated at all. + // If a node is in either of the following states, then it will not + // produce any geometric output. Bail after clearing the render packages. + if ((nodeModel.State == ElementState.Error) || !nodeModel.IsVisible) + return false; + + // Without AstIdentifierForPreview, a node cannot have MirrorData. + if (string.IsNullOrEmpty(nodeModel.AstIdentifierForPreview.Value)) + return false; + drawableIds = initParams.DrawableIds; if (!drawableIds.Any()) return false; // Nothing to be drawn.
[UpdateRenderPackageAsyncTask->[AddToLabelMap->[AddToLabelMap]]]
Initializes the render-package update task, returning early for nodes that produce no geometric output.
Remind me to keep it updated because I introduce a new state `ElementState.AstBuildBroken` which is a kind of error state.
@@ -675,8 +675,8 @@ public abstract class MessagingGatewaySupport extends AbstractEndpoint return reply .doOnSubscribe(s -> { - if (!error && this.countsEnabled) { - this.messageCount.incrementAndGet(); + if (!error) { + // TODO Micrometer counter } }) .<Message<?>>map(replyMessage -> {
[MessagingGatewaySupport->[doSendAndReceive->[sendAndReceive,initializeIfNecessary,getRequestChannel],sendMessageForReactiveFlow->[send],receive->[getReplyChannel,initializeIfNecessary],sendErrorMessageAndReceive->[sendAndReceive],sendAndReceiveMessageReactive->[initializeIfNecessary,getRequestChannel],handleSendError->[doSendAndReceiveMessageReactive,getErrorChannel],buildErrorMessage->[buildErrorMessage],receiveMessage->[getReplyChannel,receive,initializeIfNecessary],setShouldTrack->[setShouldTrack],registerReplyMessageCorrelatorIfNecessary->[getReplyChannel],send->[getErrorChannel,send,initializeIfNecessary,getRequestChannel],handleSendAndReceiveError->[getErrorChannel]]]
Build a reply Mono.
Probably we need to raise an issue to not forget this TODO...
@@ -65,8 +65,11 @@ int WPACKET_reserve_bytes(WPACKET *pkt, size_t len, unsigned char **allocbytes) if (BUF_MEM_grow(pkt->buf, newlen) == 0) return 0; } - if (allocbytes != NULL) + if (allocbytes != NULL) { *allocbytes = WPACKET_get_curr(pkt); + if (pkt->endfirst) + *allocbytes -= len; + } return 1; }
[WPACKET_start_sub_packet_len__->[WPACKET_allocate_bytes],WPACKET_memcpy->[WPACKET_allocate_bytes],WPACKET_init->[WPACKET_init_len],WPACKET_sub_allocate_bytes__->[WPACKET_allocate_bytes],WPACKET_start_sub_packet->[WPACKET_start_sub_packet_len__],WPACKET_memset->[WPACKET_allocate_bytes],WPACKET_sub_memcpy__->[WPACKET_close,WPACKET_memcpy,WPACKET_start_sub_packet_len__],int->[WPACKET_allocate_bytes],WPACKET_sub_reserve_bytes__->[WPACKET_reserve_bytes],WPACKET_put_bytes__->[WPACKET_allocate_bytes]]
Reserves bytes in the WPACKET buffer and returns a pointer to the reserved region.
Should be `if (pkt->endfirst && *allocbytes != NULL)`
@@ -543,12 +543,11 @@ public class NotebookServer extends WebSocketServlet implements public void broadcastNote(Note note) { broadcast(note.getId(), new Message(OP.NOTE).put("note", note)); + broadcastToWatchers(note.getId(), "", new Message(OP.NOTE).put("note", note)); } - public void broadcastInterpreterBindings(String noteId, - List settingList) { - broadcast(noteId, new Message(OP.INTERPRETER_BINDINGS) - .put("interpreterBindings", settingList)); + public void broadcastInterpreterBindings(String noteId, List settingList) { + broadcast(noteId, new Message(OP.INTERPRETER_BINDINGS).put("interpreterBindings", settingList)); } public void broadcastNoteList(AuthenticationInfo subject, HashSet userAndRoles) {
[NotebookServer->[multicastToUser->[serializeMessage],pushAngularObjectToRemoteRegistry->[broadcastExcept],broadcastInterpreterBindings->[broadcast],onRemove->[broadcast,notebook],generateNotesInfo->[notebook],updateParagraph->[permissionError,getOpenNoteId,broadcast],onLoad->[broadcast],unicastNoteList->[generateNotesInfo,unicast],clearAllParagraphOutput->[permissionError,broadcastNote,clearAllParagraphOutput],onOutputAppend->[broadcast],broadcastUpdateNoteJobInfo->[getKey,broadcast,notebook],broadcastNoteListExcept->[multicastToUser,generateNotesInfo],broadcast->[serializeMessage],sendNote->[permissionError,serializeMessage,addConnectionToNote],renameNote->[permissionError,broadcastNote,broadcastNoteList],broadcastExcept->[serializeMessage],insertParagraph->[permissionError,getOpenNoteId,broadcastNote,insertParagraph],checkpointNote->[checkpointNote,serializeMessage],saveInterpreterBindings->[notebook],clearParagraphOutput->[permissionError,getOpenNoteId,clearParagraphOutput,broadcastNote],onStatusChange->[broadcast],cancelParagraph->[permissionError,getOpenNoteId],ParagraphListenerImpl->[onOutputAppend->[broadcast],onProgressUpdate->[broadcast],onOutputUpdate->[broadcast],afterStatusChange->[broadcastUpdateNoteJobInfo,broadcastNote]],removeNote->[permissionError,removeNote,broadcastNoteList],unsubscribeNoteJobInfo->[getKey,removeConnectionFromNote],broadcastNote->[broadcast],removeAngularFromRemoteRegistry->[broadcastExcept],onMessage->[notebook],onUpdate->[broadcast,notebook],createNote->[broadcastNoteList,createNote,serializeMessage,addConnectionToNote],sendAllConfigurations->[serializeMessage],unicastNoteJobInfo->[getKey,serializeMessage,addConnectionToNote],broadcastNoteList->[multicastToUser,generateNotesInfo],updateNote->[permissionError,broadcastNote,broadcastNoteList],getEditorSetting->[getOpenNoteId,serializeMessage,getEditorSetting],getNoteByRevision->[getNoteByRevision,serializeMessage],broadcastToNoteBindedInterpreter->[notebook],onOutputUpdated->[broadcast],removeConnectionFromAllNote->[removeConnectionFromNote],getParagraphJobListener->[ParagraphListenerImpl],sendHomeNote->[permissionError,serializeMessage,addConnectionToNote,removeConnectionFromAllNote],angularObjectUpdated->[broadcastExcept],removeParagraph->[permissionError,getOpenNoteId,broadcastNote,removeParagraph],sendAllAngularObjects->[serializeMessage],broadcastReloadedNoteList->[multicastToUser,generateNotesInfo],getInterpreterBindings->[getInterpreterBindings,serializeMessage,notebook],moveParagraph->[permissionError,getOpenNoteId,broadcastNote,moveParagraph],permissionError->[serializeMessage],runParagraph->[permissionError,getOpenNoteId,serializeMessage,broadcast],getInterpreterSettings->[serializeMessage],unicast->[serializeMessage],completion->[getOpenNoteId,serializeMessage,completion],importNote->[broadcastNoteList,importNote,broadcastNote],pushAngularObjectToLocalRepo->[broadcastExcept],removeAngularObjectFromLocalRepo->[broadcastExcept],NotebookInformationListener->[onParagraphCreate->[getKey,broadcast,notebook],onParagraphRemove->[broadcastUpdateNoteJobInfo],onParagraphStatusChange->[getKey,broadcast,notebook],onNoteCreate->[getKey,broadcast,notebook],onUnbindInterpreter->[getKey,broadcast,notebook],onNoteRemove->[getKey,broadcastUpdateNoteJobInfo,broadcast]],getNotebookInformationListener->[NotebookInformationListener],cloneNote->[broadcastNoteList,getOpenNoteId,serializeMessage,addConnectionToNote,cloneNote],listRevisionHistory->[listRevisionHistory,serializeMessage]]]
Broadcast a note to all users who have the note.
This one is also called inside of broadcast() a line above.
@@ -243,7 +243,7 @@ class DVSNIResponseTest(unittest.TestCase): 'validation': self.validation.to_json(), } - # pylint: disable=invalid-name + label1 = b'e2df3498860637c667fedadc5a8494ec' label2 = b'09dcc75553c9b3bd73662b50e71b1e42' self.z = label1 + label2
[ProofOfPossessionHintsTest->[test_from_json->[from_json,assertEqual],test_json_without_optionals->[from_json,assertEqual,to_partial_json],test_from_json_hashable->[from_json,hash],setUp->[encode_b64jose,Hints,public_key,to_json,update,copy,dump_certificate,JWKRSA],test_to_partial_json->[assertEqual,to_partial_json]],RecoveryContactTest->[test_from_json->[from_json,assertEqual],test_json_without_optionals->[assertTrue,from_json,assertEqual,to_partial_json],test_from_json_hashable->[from_json,hash],setUp->[RecoveryContact],test_to_partial_json->[assertEqual,to_partial_json]],DNSTest->[test_gen_check_validation_wrong_key->[load_vector,gen_validation,public_key,assertFalse,check_validation,load],test_from_json->[from_json,assertEqual],test_check_validation_wrong_fields->[public_key,update,sign,assertFalse,check_validation],test_check_validation_wrong_payload->[public_key,tuple,sign,assertFalse,check_validation],test_from_json_hashable->[from_json,hash],setUp->[b64decode,load_vector,DNS,load],test_gen_check_validation->[assertTrue,gen_validation,check_validation,public_key],test_to_partial_json->[assertEqual,to_partial_json],test_validation_domain_name->[assertEqual,validation_domain_name],test_gen_response->[assertTrue,assertEqual,gen_response,isinstance,patch]],DNSResponseTest->[test_from_json->[from_json,assertEqual],test_check_validation->[assertTrue,check_validation,public_key],test_from_json_hashable->[from_json,hash],setUp->[DNSResponse,to_json,DNS,sign,JWKRSA,b64decode,json_dumps],test_to_partial_json->[assertEqual,to_partial_json]],SimpleHTTPResponseTest->[test_simple_verify_good_validation->[reset_mock,assertTrue,assert_called_once_with,load_vector,simple_verify,json_dumps,uri,gen_validation,MagicMock,load],test_uri->[uri,assertEqual],test_simple_verify_connection_error->[assertFalse,simple_verify],test_simple_verify_bad_content_type->[assertFalse,mock_get,simple_verify],test_gen_check_validation_wrong_key->[load_vector,gen_validation,public_key,assertFalse,check_validation,load],test_simple_verify_port->[urlparse,assertEqual,simple_verify],test_from_json->[from_json,assertEqual],test_check_validation_wrong_fields->[load_vector,public_key,assertFalse,update,tuple,check_validation,sign,gen_resource,json_dumps,load],test_port->[assertEqual],test_check_validation_wrong_payload->[load_vector,public_key,tuple,check_validation,sign,assertFalse,json_dumps,load],setUp->[SimpleHTTPResponse,SimpleHTTP],test_simple_verify_bad_validation->[assertFalse,MagicMock,simple_verify],test_from_json_hashable->[from_json,hash],test_scheme->[assertEqual],test_gen_check_validation->[assertTrue,load_vector,public_key,gen_validation,check_validation,load],test_to_partial_json->[assertEqual,to_partial_json],patch],RecoveryContactResponseTest->[test_from_json->[from_json,assertEqual],test_json_without_optionals->[assertTrue,from_json,assertEqual,to_partial_json],test_from_json_hashable->[from_json,hash],setUp->[RecoveryContactResponse],test_to_partial_json->[assertEqual,to_partial_json]],ProofOfPossessionResponseTest->[test_from_json->[from_json,assertEqual],test_from_json_hashable->[from_json,hash],test_to_partial_json->[assertEqual,to_partial_json],setUp->[Signature,public_key,ProofOfPossessionResponse,to_json,JWKRSA],test_verify->[assertTrue,verify]],DVSNIResponseTest->[test_simple_verify_wrong_payload->[simple_verify,public_key,update,sign,assertFalse],test_probe_cert->[probe_cert,assert_called_once_with,assert_called_with],test_verify_bad_cert->[assertFalse,load_cert,verify_cert],test_simple_verify_false_on_probe_error->[assertFalse,Mock,simple_verify,public_key],test_z_and_domain->[assertEqual],test_from_json->[from_json,assertEqual],test_simple_verify->[assert_called_once_with,assertEqual,simple_verify,public_key],test_from_json_hashable->[from_json,hash],test_simple_verify_wrong_account_key->[assertFalse,simple_verify,load],test_simple_verify_wrong_token->[simple_verify,public_key,update,sign,assertFalse],test_to_partial_json->[assertEqual,to_partial_json],test_gen_verify_cert->[assertTrue,gen_cert,assertEqual,verify_cert,load_pyopenssl_private_key],setUp->[DVSNI,to_json,sign,JWKRSA,DVSNIResponse,b64decode,json_dumps],test_gen_verify_cert_gen_key->[assertTrue,gen_cert,isinstance,verify_cert],patch],SimpleHTTPTest->[test_from_json->[from_json,assertEqual],test_good_token->[assertTrue,assertFalse,update],test_from_json_hashable->[from_json,hash],setUp->[decode_b64jose,SimpleHTTP],test_to_partial_json->[assertEqual,to_partial_json]],DVSNITest->[test_from_json_invalid_token_length->[encode_b64jose,assertRaises],test_from_json->[from_json,assertEqual],test_from_json_hashable->[from_json,hash],setUp->[b64decode,DVSNI],test_to_partial_json->[assertEqual,to_partial_json],test_gen_response->[json_loads,assertEqual,gen_response,JWKRSA]],ProofOfPossessionTest->[test_from_json_hashable->[from_json,hash],test_from_json->[from_json,assertEqual],test_to_partial_json->[assertEqual,to_partial_json],setUp->[Hints,public_key,ProofOfPossession,to_json,JWKRSA]],load_rsa_private_key,main,load_cert]
Sets up the object variables.
Unnecessary newline. It looks like there are a few of these wherever `invalid-name` was the only option being temporarily disabled in `pylint`.
@@ -5,7 +5,7 @@ from allennlp.semparse.type_declarations.type_declaration import ( ComplexType, ANY_TYPE, ) -from allennlp.semparse.type_declarations.wikitables_type_declaration import ( +from allennlp.semparse.type_declarations.wikitables_lambda_dcs import ( ARG_EXTREME_TYPE, ArgExtremeType, CELL_TYPE,
[WikiTablesTypeDeclarationTest->[test_reverse_resolves_correctly->[resolve,ComplexType,ReverseType],test_arg_extreme_type_resolves_correctly->[resolve,ComplexType,ArgExtremeType],test_count_type_resolves_correctly->[resolve,ComplexType,CountType],test_conjunction_maps_to_correct_actions->[get_valid_actions]]]
Tests that complex types in the WikiTables type declaration resolve correctly.
Rename the file here, too.
@@ -36,6 +36,8 @@ import java.util.Set; */ public abstract class AbstractDataWriteCommand extends AbstractDataCommand implements DataWriteCommand { + protected boolean previousRead; + protected AbstractDataWriteCommand() { }
[AbstractDataWriteCommand->[isReturnValueExpected->[contains],getAffectedKeys->[singleton]]]
Abstract data write command.
Hmmmm, is this really needed? You check whether an entry has been read or not by looking into the InvocationContext. If the entry has been read, the context should have it as one of the looked up entry. Also, anything you add to a command is generally because it needs to be replicated. Do you really need to replicate this information to other nodes?
@@ -267,7 +267,7 @@ export class WebLoginDialog { } dev().fine(TAG, 'Login done: ', result, opt_error); if (opt_error) { - this.reject_(opt_error); + this.reject_(user().createError('login failed', opt_error)); } else { this.resolve_(result); }
[No CFG could be retrieved]
Called when the AMP login flow completes; resolves with the result or rejects with an error.
Why? Isn't error already marked correctly as a user error or dev error? Why are we changing the meaning of the error?
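The change wraps the raw failure in a user-level error before rejecting, rather than rejecting with the low-level value directly; Python's exception chaining expresses the same idea (function name invented for illustration):

```python
def create_login_error(cause):
    """Wrap a low-level failure in a user-facing error, keeping the cause."""
    error = RuntimeError("login failed")
    error.__cause__ = cause  # original failure stays attached for debugging
    return error
```

The consumer sees a stable "login failed" message, while logs can still surface the underlying cause.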