code | docstring | func_name | language | repo | path | url | license |
|---|---|---|---|---|---|---|---|
public <R> SingleOutputStreamOperator<R> select(
final PatternSelectFunction<T, R> patternSelectFunction) {
// we have to extract the output type from the provided pattern selection function manually
// because the TypeExtractor cannot do that if the method is wrapped in a MapFunction
final TypeInformation<R> returnType =
TypeExtractor.getUnaryOperatorReturnType(
patternSelectFunction,
PatternSelectFunction.class,
0,
1,
TypeExtractor.NO_INDEX,
builder.getInputType(),
null,
false);
return select(patternSelectFunction, returnType);
} | Applies a select function to the detected pattern sequence. For each pattern sequence the
provided {@link PatternSelectFunction} is called. The pattern select function can produce
exactly one resulting element.
@param patternSelectFunction The pattern select function which is called for each detected
pattern sequence.
@param <R> Type of the resulting elements
@return {@link DataStream} which contains the resulting elements from the pattern select
function. | select | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | Apache-2.0 |
public <R> SingleOutputStreamOperator<R> select(
final PatternSelectFunction<T, R> patternSelectFunction,
final TypeInformation<R> outTypeInfo) {
final PatternProcessFunction<T, R> processFunction =
fromSelect(builder.clean(patternSelectFunction)).build();
return process(processFunction, outTypeInfo);
} | Applies a select function to the detected pattern sequence. For each pattern sequence the
provided {@link PatternSelectFunction} is called. The pattern select function can produce
exactly one resulting element.
@param patternSelectFunction The pattern select function which is called for each detected
pattern sequence.
@param <R> Type of the resulting elements
@param outTypeInfo Explicit specification of output type.
@return {@link DataStream} which contains the resulting elements from the pattern select
function. | select | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | Apache-2.0 |
public <L, R> SingleOutputStreamOperator<R> select(
final OutputTag<L> timedOutPartialMatchesTag,
final PatternTimeoutFunction<T, L> patternTimeoutFunction,
final PatternSelectFunction<T, R> patternSelectFunction) {
final TypeInformation<R> rightTypeInfo =
TypeExtractor.getUnaryOperatorReturnType(
patternSelectFunction,
PatternSelectFunction.class,
0,
1,
TypeExtractor.NO_INDEX,
builder.getInputType(),
null,
false);
return select(
timedOutPartialMatchesTag,
patternTimeoutFunction,
rightTypeInfo,
patternSelectFunction);
} | Applies a select function to the detected pattern sequence. For each pattern sequence the
provided {@link PatternSelectFunction} is called. The pattern select function can produce
exactly one resulting element.
<p>Applies a timeout function to a partial pattern sequence which has timed out. For each
partial pattern sequence the provided {@link PatternTimeoutFunction} is called. The pattern
timeout function can produce exactly one resulting element.
<p>You can get the stream of timed-out data by applying {@link
SingleOutputStreamOperator#getSideOutput(OutputTag)} to the {@link
SingleOutputStreamOperator} resulting from the select operation, using the same {@link
OutputTag}.
@param timedOutPartialMatchesTag {@link OutputTag} that identifies side output with timed out
patterns
@param patternTimeoutFunction The pattern timeout function which is called for each partial
pattern sequence which has timed out.
@param patternSelectFunction The pattern select function which is called for each detected
pattern sequence.
@param <L> Type of the resulting timeout elements
@param <R> Type of the resulting elements
@return {@link DataStream} which contains the resulting elements with the resulting timeout
elements in a side output. | select | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | Apache-2.0 |
public <L, R> SingleOutputStreamOperator<R> select(
final OutputTag<L> timedOutPartialMatchesTag,
final PatternTimeoutFunction<T, L> patternTimeoutFunction,
final TypeInformation<R> outTypeInfo,
final PatternSelectFunction<T, R> patternSelectFunction) {
final PatternProcessFunction<T, R> processFunction =
fromSelect(builder.clean(patternSelectFunction))
.withTimeoutHandler(
timedOutPartialMatchesTag, builder.clean(patternTimeoutFunction))
.build();
return process(processFunction, outTypeInfo);
} | Applies a select function to the detected pattern sequence. For each pattern sequence the
provided {@link PatternSelectFunction} is called. The pattern select function can produce
exactly one resulting element.
<p>Applies a timeout function to a partial pattern sequence which has timed out. For each
partial pattern sequence the provided {@link PatternTimeoutFunction} is called. The pattern
timeout function can produce exactly one resulting element.
<p>You can get the stream of timed-out data by applying {@link
SingleOutputStreamOperator#getSideOutput(OutputTag)} to the {@link
SingleOutputStreamOperator} resulting from the select operation, using the same {@link
OutputTag}.
@param timedOutPartialMatchesTag {@link OutputTag} that identifies side output with timed out
patterns
@param patternTimeoutFunction The pattern timeout function which is called for each partial
pattern sequence which has timed out.
@param outTypeInfo Explicit specification of output type.
@param patternSelectFunction The pattern select function which is called for each detected
pattern sequence.
@param <L> Type of the resulting timeout elements
@param <R> Type of the resulting elements
@return {@link DataStream} which contains the resulting elements with the resulting timeout
elements in a side output. | select | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | Apache-2.0 |
public <R> SingleOutputStreamOperator<R> flatSelect(
final PatternFlatSelectFunction<T, R> patternFlatSelectFunction) {
// we have to extract the output type from the provided pattern selection function manually
// because the TypeExtractor cannot do that if the method is wrapped in a MapFunction
final TypeInformation<R> outTypeInfo =
TypeExtractor.getUnaryOperatorReturnType(
patternFlatSelectFunction,
PatternFlatSelectFunction.class,
0,
1,
new int[] {1, 0},
builder.getInputType(),
null,
false);
return flatSelect(patternFlatSelectFunction, outTypeInfo);
} | Applies a flat select function to the detected pattern sequence. For each pattern sequence
the provided {@link PatternFlatSelectFunction} is called. The pattern flat select function
can produce an arbitrary number of resulting elements.
@param patternFlatSelectFunction The pattern flat select function which is called for each
detected pattern sequence.
@param <R> Type of the resulting elements
@return {@link DataStream} which contains the resulting elements from the pattern flat select
function. | flatSelect | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | Apache-2.0 |
public <R> SingleOutputStreamOperator<R> flatSelect(
final PatternFlatSelectFunction<T, R> patternFlatSelectFunction,
final TypeInformation<R> outTypeInfo) {
final PatternProcessFunction<T, R> processFunction =
fromFlatSelect(builder.clean(patternFlatSelectFunction)).build();
return process(processFunction, outTypeInfo);
} | Applies a flat select function to the detected pattern sequence. For each pattern sequence
the provided {@link PatternFlatSelectFunction} is called. The pattern flat select function
can produce an arbitrary number of resulting elements.
@param patternFlatSelectFunction The pattern flat select function which is called for each
detected pattern sequence.
@param <R> Type of the resulting elements
@param outTypeInfo Explicit specification of output type.
@return {@link DataStream} which contains the resulting elements from the pattern flat select
function. | flatSelect | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | Apache-2.0 |
public <L, R> SingleOutputStreamOperator<R> flatSelect(
final OutputTag<L> timedOutPartialMatchesTag,
final PatternFlatTimeoutFunction<T, L> patternFlatTimeoutFunction,
final PatternFlatSelectFunction<T, R> patternFlatSelectFunction) {
final TypeInformation<R> rightTypeInfo =
TypeExtractor.getUnaryOperatorReturnType(
patternFlatSelectFunction,
PatternFlatSelectFunction.class,
0,
1,
new int[] {1, 0},
builder.getInputType(),
null,
false);
return flatSelect(
timedOutPartialMatchesTag,
patternFlatTimeoutFunction,
rightTypeInfo,
patternFlatSelectFunction);
} | Applies a flat select function to the detected pattern sequence. For each pattern sequence
the provided {@link PatternFlatSelectFunction} is called. The pattern flat select function can
produce an arbitrary number of resulting elements.
<p>Applies a timeout function to a partial pattern sequence which has timed out. For each
partial pattern sequence the provided {@link PatternFlatTimeoutFunction} is called. The
pattern flat timeout function can produce an arbitrary number of resulting elements.
<p>You can get the stream of timed-out data by applying {@link
SingleOutputStreamOperator#getSideOutput(OutputTag)} to the {@link
SingleOutputStreamOperator} resulting from the select operation, using the same {@link
OutputTag}.
@param timedOutPartialMatchesTag {@link OutputTag} that identifies side output with timed out
patterns
@param patternFlatTimeoutFunction The pattern timeout function which is called for each
partial pattern sequence which has timed out.
@param patternFlatSelectFunction The pattern select function which is called for each
detected pattern sequence.
@param <L> Type of the resulting timeout elements
@param <R> Type of the resulting elements
@return {@link DataStream} which contains the resulting elements with the resulting timeout
elements in a side output. | flatSelect | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | Apache-2.0 |
public <L, R> SingleOutputStreamOperator<R> flatSelect(
final OutputTag<L> timedOutPartialMatchesTag,
final PatternFlatTimeoutFunction<T, L> patternFlatTimeoutFunction,
final TypeInformation<R> outTypeInfo,
final PatternFlatSelectFunction<T, R> patternFlatSelectFunction) {
final PatternProcessFunction<T, R> processFunction =
fromFlatSelect(builder.clean(patternFlatSelectFunction))
.withTimeoutHandler(
timedOutPartialMatchesTag,
builder.clean(patternFlatTimeoutFunction))
.build();
return process(processFunction, outTypeInfo);
} | Applies a flat select function to the detected pattern sequence. For each pattern sequence
the provided {@link PatternFlatSelectFunction} is called. The pattern flat select function can
produce an arbitrary number of resulting elements.
<p>Applies a timeout function to a partial pattern sequence which has timed out. For each
partial pattern sequence the provided {@link PatternFlatTimeoutFunction} is called. The
pattern flat timeout function can produce an arbitrary number of resulting elements.
<p>You can get the stream of timed-out data by applying {@link
SingleOutputStreamOperator#getSideOutput(OutputTag)} to the {@link
SingleOutputStreamOperator} resulting from the select operation, using the same {@link
OutputTag}.
@param timedOutPartialMatchesTag {@link OutputTag} that identifies side output with timed out
patterns
@param patternFlatTimeoutFunction The pattern timeout function which is called for each
partial pattern sequence which has timed out.
@param patternFlatSelectFunction The pattern select function which is called for each
detected pattern sequence.
@param outTypeInfo Explicit specification of output type.
@param <L> Type of the resulting timeout elements
@param <R> Type of the resulting elements
@return {@link DataStream} which contains the resulting elements with the resulting timeout
elements in a side output. | flatSelect | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStream.java | Apache-2.0 |
TypeInformation<IN> getInputType() {
return inputStream.getType();
} | Returns the type information of the elements in the input stream.
@return The type information of the input stream's elements. | getInputType | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStreamBuilder.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/PatternStreamBuilder.java | Apache-2.0 |
public DeweyNumber increase() {
return increase(1);
} | Creates a new Dewey number from this one such that its last digit is increased by one.
@return A new Dewey number derived from this one whose last digit is increased by one | increase | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/DeweyNumber.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/DeweyNumber.java | Apache-2.0 |
public DeweyNumber increase(int times) {
int[] newDeweyNumber = Arrays.copyOf(deweyNumber, deweyNumber.length);
newDeweyNumber[deweyNumber.length - 1] += times;
return new DeweyNumber(newDeweyNumber);
} | Creates a new Dewey number from this one such that its last digit is increased by the
supplied number.
@param times how many times to increase the Dewey number
@return A new Dewey number derived from this one whose last digit is increased by the given number | increase | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/DeweyNumber.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/DeweyNumber.java | Apache-2.0 |
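The increase operation above copies the digit array and bumps its last entry; `increase()` is just `increase(1)`. A minimal standalone sketch of that semantics (the `DeweySketch` class is hypothetical, not part of Flink):

```java
import java.util.Arrays;

// Hypothetical standalone sketch of DeweyNumber.increase(int times); not Flink code.
final class DeweySketch {
    // Returns a copy of the digits whose last entry is increased by the given amount.
    static int[] increase(int[] digits, int times) {
        int[] next = Arrays.copyOf(digits, digits.length);
        next[next.length - 1] += times;
        return next;
    }
}
```

As in the original, the input array is never mutated, so previously derived Dewey numbers remain valid.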
public static DeweyNumber fromString(final String deweyNumberString) {
String[] splits = deweyNumberString.split("\\.");
if (splits.length == 1) {
return new DeweyNumber(Integer.parseInt(deweyNumberString));
} else if (splits.length > 0) {
int[] deweyNumber = new int[splits.length];
for (int i = 0; i < splits.length; i++) {
deweyNumber[i] = Integer.parseInt(splits[i]);
}
return new DeweyNumber(deweyNumber);
} else {
throw new IllegalArgumentException(
"Failed to parse " + deweyNumberString + " as a Dewey number");
}
} | Creates a dewey number from a string representation. The input string must be a dot separated
string of integers.
@param deweyNumberString Dot separated string of integers
@return Dewey number generated from the given input string | fromString | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/DeweyNumber.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/DeweyNumber.java | Apache-2.0 |
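The parser above splits the input on dots and parses each component as an integer. A self-contained sketch of the same logic (the class name is hypothetical, not part of Flink):

```java
// Hypothetical sketch of DeweyNumber.fromString's dot-separated parsing; not Flink code.
final class DeweyParseSketch {
    static int[] fromString(String deweyNumberString) {
        String[] splits = deweyNumberString.split("\\.");
        if (splits.length == 0) { // e.g. "." splits into an empty array
            throw new IllegalArgumentException(
                    "Failed to parse " + deweyNumberString + " as a Dewey number");
        }
        int[] deweyNumber = new int[splits.length];
        for (int i = 0; i < splits.length; i++) {
            deweyNumber[i] = Integer.parseInt(splits[i]);
        }
        return deweyNumber;
    }
}
```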
public void open(RuntimeContext cepRuntimeContext, Configuration conf) throws Exception {
for (State<T> state : getStates()) {
for (StateTransition<T> transition : state.getStateTransitions()) {
IterativeCondition condition = transition.getCondition();
FunctionUtils.setFunctionRuntimeContext(condition, cepRuntimeContext);
FunctionUtils.openFunction(condition, DefaultOpenContext.INSTANCE);
}
}
} | Initialization method for the NFA. It is called before any element is passed and thus
suitable for one time setup work.
@param cepRuntimeContext runtime context of the enclosing operator
@param conf The configuration containing the parameters attached to the contract. | open | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFA.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFA.java | Apache-2.0 |
public void close() throws Exception {
for (State<T> state : getStates()) {
for (StateTransition<T> transition : state.getStateTransitions()) {
IterativeCondition condition = transition.getCondition();
FunctionUtils.closeFunction(condition);
}
}
} | Tear-down method for the NFA. | close | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFA.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFA.java | Apache-2.0 |
public Collection<Map<String, List<T>>> process(
final SharedBufferAccessor<T> sharedBufferAccessor,
final NFAState nfaState,
final T event,
final long timestamp,
final AfterMatchSkipStrategy afterMatchSkipStrategy,
final TimerService timerService)
throws Exception {
try (EventWrapper eventWrapper = new EventWrapper(event, timestamp, sharedBufferAccessor)) {
return doProcess(
sharedBufferAccessor,
nfaState,
eventWrapper,
afterMatchSkipStrategy,
timerService);
}
} | Processes the next input event. If some of the computations reach a final state then the
resulting event sequences are returned. If computations time out and timeout handling is
activated, then the timed out event patterns are returned.
<p>If computations reach a stop state, the path forward is discarded and the currently
constructed path is returned with the element that resulted in the stop state.
@param sharedBufferAccessor the accessor to SharedBuffer object that we need to work upon
while processing
@param nfaState The NFAState object that we need to affect while processing
@param event The current event to be processed or null if only pruning shall be done
@param timestamp The timestamp of the current event
@param afterMatchSkipStrategy The skip strategy to use after per match
@param timerService gives access to processing time and time characteristic, needed for
condition evaluation
@return The collection of matched pattern sequences, i.e. the results of computations which
have reached a final state
@throws Exception Thrown if the system cannot access the state. | process | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFA.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFA.java | Apache-2.0 |
private Map<String, List<EventId>> extractCurrentMatches(
final SharedBufferAccessor<T> sharedBufferAccessor,
final ComputationState computationState)
throws Exception {
if (computationState.getPreviousBufferEntry() == null) {
return new HashMap<>();
}
List<Map<String, List<EventId>>> paths =
sharedBufferAccessor.extractPatterns(
computationState.getPreviousBufferEntry(), computationState.getVersion());
if (paths.isEmpty()) {
return new HashMap<>();
}
// for a given computation state, we cannot have more than one matching pattern.
Preconditions.checkState(paths.size() == 1);
return paths.get(0);
} | Extracts all the sequences of events from the start to the given computation state. An event
sequence is returned as a map which contains the events and the names of the states to which
the events were mapped.
@param sharedBufferAccessor The accessor to {@link SharedBuffer} from which to extract the
matches
@param computationState The end computation state of the extracted event sequences
@return Collection of event sequences which end in the given computation state
@throws Exception Thrown if the system cannot access the state. | extractCurrentMatches | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFA.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFA.java | Apache-2.0 |
public boolean isStateChanged() {
return stateChanged;
} | Check if the matching status of the NFA has changed so far.
@return {@code true} if matching status has changed, {@code false} otherwise | isStateChanged | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFAState.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFAState.java | Apache-2.0 |
public void resetStateChanged() {
this.stateChanged = false;
} | Reset the changed bit checked via {@link #isStateChanged()} to {@code false}. | resetStateChanged | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFAState.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/NFAState.java | Apache-2.0 |
public void prune(
Collection<ComputationState> matchesToPrune,
Collection<Map<String, List<EventId>>> matchedResult,
SharedBufferAccessor<?> sharedBufferAccessor)
throws Exception {
if (!isSkipStrategy()) {
return;
}
EventId pruningId = getPruningId(matchedResult);
if (pruningId != null) {
List<ComputationState> discardStates = new ArrayList<>();
for (ComputationState computationState : matchesToPrune) {
if (computationState.getStartEventID() != null
&& shouldPrune(computationState.getStartEventID(), pruningId)) {
sharedBufferAccessor.releaseNode(
computationState.getPreviousBufferEntry(),
computationState.getVersion());
discardStates.add(computationState);
}
}
matchesToPrune.removeAll(discardStates);
}
} | Prunes matches/partial matches based on the chosen strategy.
@param matchesToPrune current partial matches
@param matchedResult already completed matches
@param sharedBufferAccessor accessor to corresponding shared buffer
@throws Exception thrown if the state could not be accessed | prune | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/aftermatch/AfterMatchSkipStrategy.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/aftermatch/AfterMatchSkipStrategy.java | Apache-2.0 |
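The pruning loop above uses a collect-then-removeAll shape: offending partial matches are first gathered into a discard list and then removed in one step, which avoids modifying the collection while iterating over it. A simplified standalone sketch of that shape (the class name and the `long` start-id stand-in are hypothetical, not Flink code):

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

// Hypothetical sketch of the collect-then-removeAll pruning shape; not Flink code.
final class PruneSketch {
    // Removes every partial match whose start event id precedes the pruning id.
    static void prune(Collection<Long> startIds, long pruningId) {
        List<Long> discard = new ArrayList<>();
        for (Long startId : startIds) {
            if (startId < pruningId) {
                discard.add(startId);
            }
        }
        startIds.removeAll(discard);
    }
}
```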
@SuppressWarnings("unchecked")
public static <T> NFAFactory<T> compileFactory(
final Pattern<T, ?> pattern, boolean timeoutHandling) {
if (pattern == null) {
// return a factory for empty NFAs
return new NFAFactoryImpl<>(
0,
Collections.<String, Long>emptyMap(),
Collections.<State<T>>emptyList(),
timeoutHandling);
} else {
final NFAFactoryCompiler<T> nfaFactoryCompiler = new NFAFactoryCompiler<>(pattern);
nfaFactoryCompiler.compileFactory();
return new NFAFactoryImpl<>(
nfaFactoryCompiler.getWindowTime(),
nfaFactoryCompiler.getWindowTimes(),
nfaFactoryCompiler.getStates(),
timeoutHandling);
}
} | Compiles the given pattern into a {@link NFAFactory}. The NFA factory can be used to create
multiple NFAs.
@param pattern Definition of sequence pattern
@param timeoutHandling True if the NFA shall return timed out event patterns
@param <T> Type of the input events
@return Factory for NFAs corresponding to the given pattern | compileFactory | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
public static boolean canProduceEmptyMatches(final Pattern<?, ?> pattern) {
NFAFactoryCompiler<?> compiler = new NFAFactoryCompiler<>(checkNotNull(pattern));
compiler.compileFactory();
State<?> startState =
compiler.getStates().stream()
.filter(State::isStart)
.findFirst()
.orElseThrow(
() ->
new IllegalStateException(
"Compiler produced no start state. It is a bug. File a jira."));
Set<State<?>> visitedStates = new HashSet<>();
final Stack<State<?>> statesToCheck = new Stack<>();
statesToCheck.push(startState);
while (!statesToCheck.isEmpty()) {
final State<?> currentState = statesToCheck.pop();
if (visitedStates.contains(currentState)) {
continue;
} else {
visitedStates.add(currentState);
}
for (StateTransition<?> transition : currentState.getStateTransitions()) {
if (transition.getAction() == StateTransitionAction.PROCEED) {
if (transition.getTargetState().isFinal()) {
return true;
} else {
statesToCheck.push(transition.getTargetState());
}
}
}
}
return false;
} | Verifies if the provided pattern can possibly generate an empty match. Examples of patterns
that can generate empty matches are A*, A?, A* B?, etc.
@param pattern pattern to check
@return true if the pattern can potentially produce an empty match, false otherwise | canProduceEmptyMatches | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
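The check above is a stack-based traversal from the start state that follows only PROCEED transitions and succeeds if it can reach a final state without consuming any event. The same traversal can be sketched on a plain adjacency map (class, method, and state names are hypothetical, not Flink code):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch of "can the start state PROCEED to a final state?"; not Flink code.
final class EmptyMatchSketch {
    // proceedEdges maps each state name to the states reachable via PROCEED transitions.
    static boolean canProduceEmptyMatches(
            Map<String, List<String>> proceedEdges, String start, Set<String> finalStates) {
        Set<String> visited = new HashSet<>();
        Deque<String> statesToCheck = new ArrayDeque<>();
        statesToCheck.push(start);
        while (!statesToCheck.isEmpty()) {
            String current = statesToCheck.pop();
            if (!visited.add(current)) {
                continue; // already explored
            }
            for (String target : proceedEdges.getOrDefault(current, List.of())) {
                if (finalStates.contains(target)) {
                    return true; // a chain of PROCEEDs reaches a final state
                }
                statesToCheck.push(target);
            }
        }
        return false;
    }
}
```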
private void checkPatternWindowTimes() {
windowTime.ifPresent(
windowTime -> {
if (windowTimes.values().stream().anyMatch(time -> time > windowTime)) {
throw new MalformedPatternException(
"The window length between the previous and current event cannot be larger than the window length between the first and last event for a Pattern.");
}
});
} | Check pattern window times between events. | checkPatternWindowTimes | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private void checkPatternSkipStrategy() {
if (afterMatchSkipStrategy.getPatternName().isPresent()) {
String patternName = afterMatchSkipStrategy.getPatternName().get();
Pattern<T, ?> pattern = currentPattern;
while (pattern.getPrevious() != null && !pattern.getName().equals(patternName)) {
pattern = pattern.getPrevious();
}
// pattern name match check.
if (!pattern.getName().equals(patternName)) {
throw new MalformedPatternException(
"The pattern name specified in AfterMatchSkipStrategy "
+ "can not be found in the given Pattern");
}
}
} | Check pattern after match skip strategy. | checkPatternSkipStrategy | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private void checkPatternNameUniqueness() {
// make sure there is no pattern with name "$endState$"
stateNameHandler.checkNameUniqueness(ENDING_STATE_NAME);
Pattern patternToCheck = currentPattern;
while (patternToCheck != null) {
checkPatternNameUniqueness(patternToCheck);
patternToCheck = patternToCheck.getPrevious();
}
stateNameHandler.clear();
} | Check if there are duplicate pattern names. If yes, it throws a {@link
MalformedPatternException}. | checkPatternNameUniqueness | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private void checkPatternNameUniqueness(final Pattern pattern) {
if (pattern instanceof GroupPattern) {
Pattern patternToCheck = ((GroupPattern) pattern).getRawPattern();
while (patternToCheck != null) {
checkPatternNameUniqueness(patternToCheck);
patternToCheck = patternToCheck.getPrevious();
}
} else {
stateNameHandler.checkNameUniqueness(pattern.getName());
}
} | Check if the given pattern's name is already used or not. If yes, it throws a {@link
MalformedPatternException}.
@param pattern The pattern to be checked | checkPatternNameUniqueness | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private List<Tuple2<IterativeCondition<T>, String>> getCurrentNotCondition() {
List<Tuple2<IterativeCondition<T>, String>> notConditions = new ArrayList<>();
Pattern<T, ? extends T> previousPattern = currentPattern;
while (previousPattern.getPrevious() != null
&& (previousPattern
.getPrevious()
.getQuantifier()
.hasProperty(Quantifier.QuantifierProperty.OPTIONAL)
|| previousPattern.getPrevious().getQuantifier().getConsumingStrategy()
== Quantifier.ConsumingStrategy.NOT_FOLLOW)) {
previousPattern = previousPattern.getPrevious();
if (previousPattern.getQuantifier().getConsumingStrategy()
== Quantifier.ConsumingStrategy.NOT_FOLLOW) {
final IterativeCondition<T> notCondition = getTakeCondition(previousPattern);
notConditions.add(Tuple2.of(notCondition, previousPattern.getName()));
}
}
return notConditions;
} | Retrieves list of conditions resulting in Stop state and names of the corresponding NOT
patterns.
<p>A current not condition can be produced in two cases:
<ol>
<li>the previous pattern is a {@link Quantifier.ConsumingStrategy#NOT_FOLLOW}
<li>exists a backward path of {@link Quantifier.QuantifierProperty#OPTIONAL} patterns
to {@link Quantifier.ConsumingStrategy#NOT_FOLLOW}
</ol>
<p><b>WARNING:</b> for more info on the second case see: {@link
NFAFactoryCompiler#copyWithoutTransitiveNots(State)}
@return list of not conditions with corresponding names | getCurrentNotCondition | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private State<T> createEndingState() {
State<T> endState = createState(ENDING_STATE_NAME, State.StateType.Final);
windowTime = currentPattern.getWindowSize().map(Duration::toMillis);
return endState;
} | Creates the dummy Final {@link State} of the NFA graph.
@return dummy Final state | createEndingState | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private State<T> createMiddleStates(final State<T> sinkState) {
State<T> lastSink = sinkState;
while (currentPattern.getPrevious() != null) {
if (currentPattern.getQuantifier().getConsumingStrategy()
== Quantifier.ConsumingStrategy.NOT_FOLLOW) {
// skip notFollow patterns, they are converted into edge conditions
if ((currentPattern.getWindowSize(WithinType.PREVIOUS_AND_CURRENT).isPresent()
|| getWindowTime() > 0)
&& lastSink.isFinal()) {
final State<T> notFollow = createState(State.StateType.Pending, true);
final IterativeCondition<T> notCondition = getTakeCondition(currentPattern);
final State<T> stopState =
createStopState(notCondition, currentPattern.getName());
notFollow.addProceed(stopState, notCondition);
notFollow.addIgnore(new RichNotCondition<>(notCondition));
lastSink = notFollow;
}
} else if (currentPattern.getQuantifier().getConsumingStrategy()
== Quantifier.ConsumingStrategy.NOT_NEXT) {
final State<T> notNext = createState(State.StateType.Normal, true);
final IterativeCondition<T> notCondition = getTakeCondition(currentPattern);
final State<T> stopState =
createStopState(notCondition, currentPattern.getName());
if (lastSink.isFinal()) {
// so that the proceed to final is not fired
notNext.addIgnore(lastSink, new RichNotCondition<>(notCondition));
} else {
notNext.addProceed(lastSink, new RichNotCondition<>(notCondition));
}
notNext.addProceed(stopState, notCondition);
lastSink = notNext;
} else {
lastSink = convertPattern(lastSink);
}
// we traverse the pattern graph backwards
followingPattern = currentPattern;
currentPattern = currentPattern.getPrevious();
// the window time is the global minimum of all window times of each state
currentPattern
.getWindowSize()
.map(Duration::toMillis)
.filter(
windowSizeInMillis ->
windowSizeInMillis < windowTime.orElse(Long.MAX_VALUE))
.ifPresent(
windowSizeInMillis -> windowTime = Optional.of(windowSizeInMillis));
}
return lastSink;
} | Creates all the states between Start and Final state.
@param sinkState the state that the last state should point to (always the Final state)
@return the next state after Start in the resulting graph | createMiddleStates | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
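The backward traversal in `createMiddleStates` keeps the NFA's window time as the global minimum over the window sizes of all states. The `Optional`-based running-minimum update can be sketched in isolation (class and method names here are illustrative, not Flink API):

```java
import java.time.Duration;
import java.util.Optional;

public class WindowTimeMin {
    // Folds an optional per-state window size into the running global minimum,
    // mirroring the Optional pipeline used when traversing the pattern graph.
    static Optional<Long> updateMin(Optional<Long> current, Optional<Duration> stateWindow) {
        return stateWindow
                .map(Duration::toMillis)
                .filter(ms -> ms < current.orElse(Long.MAX_VALUE))
                .map(Optional::of)
                .orElse(current);
    }
}
```

A state with no window size leaves the current minimum untouched; a smaller window replaces it.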
@SuppressWarnings("unchecked")
private State<T> createTimesState(
final State<T> sinkState, final State<T> proceedState, Times times) {
State<T> lastSink = sinkState;
setCurrentGroupPatternFirstOfLoop(false);
final IterativeCondition<T> untilCondition =
(IterativeCondition<T>) currentPattern.getUntilCondition();
final IterativeCondition<T> innerIgnoreCondition =
extendWithUntilCondition(
getInnerIgnoreCondition(currentPattern), untilCondition, false);
final IterativeCondition<T> takeCondition =
extendWithUntilCondition(
getTakeCondition(currentPattern), untilCondition, true);
if (currentPattern.getQuantifier().hasProperty(Quantifier.QuantifierProperty.GREEDY)
&& times.getFrom() != times.getTo()) {
if (untilCondition != null) {
State<T> sinkStateCopy = copy(sinkState);
originalStateMap.put(sinkState.getName(), sinkStateCopy);
}
updateWithGreedyCondition(sinkState, takeCondition);
}
for (int i = times.getFrom(); i < times.getTo(); i++) {
lastSink =
createSingletonState(
lastSink, proceedState, takeCondition, innerIgnoreCondition, true);
addStopStateToLooping(lastSink);
}
for (int i = 0; i < times.getFrom() - 1; i++) {
lastSink =
createSingletonState(
lastSink, null, takeCondition, innerIgnoreCondition, false);
addStopStateToLooping(lastSink);
}
// we created the intermediate states in the loop, now we create the start of the loop.
setCurrentGroupPatternFirstOfLoop(true);
return createSingletonState(
lastSink,
proceedState,
takeCondition,
getIgnoreCondition(currentPattern),
isPatternOptional(currentPattern));
} | Creates a "complex" state consisting of given number of states with same {@link
IterativeCondition}.
@param sinkState the state that the created state should point to
@param proceedState state that the state being converted should proceed to
@param times number of times the state should be copied
@return the first state of the "complex" state, next state should point to it | createTimesState | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
@SuppressWarnings("unchecked")
private void setCurrentGroupPatternFirstOfLoop(boolean isFirstOfLoop) {
if (currentPattern instanceof GroupPattern) {
firstOfLoopMap.put((GroupPattern<T, ?>) currentPattern, isFirstOfLoop);
}
} | Marks the current group pattern as the head of the TIMES quantifier or not.
@param isFirstOfLoop whether the current group pattern is the head of the TIMES
quantifier | setCurrentGroupPatternFirstOfLoop | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private boolean isCurrentGroupPatternFirstOfLoop() {
if (firstOfLoopMap.containsKey(currentGroupPattern)) {
return firstOfLoopMap.get(currentGroupPattern);
} else {
return true;
}
} | Checks whether the current group pattern is the head of a TIMES/LOOPING quantifier, or is
not part of a TIMES/LOOPING quantifier pattern at all. | isCurrentGroupPatternFirstOfLoop | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private boolean headOfGroup(Pattern<T, ?> pattern) {
return currentGroupPattern != null && pattern.getPrevious() == null;
} | Checks if the given pattern is the head pattern of the current group pattern.
@param pattern the pattern to be checked
@return {@code true} iff the given pattern is in a group pattern and it is the head
pattern of the group pattern, {@code false} otherwise | headOfGroup | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
@SuppressWarnings("unchecked")
private State<T> createSingletonState(final State<T> sinkState) {
return createSingletonState(
sinkState,
sinkState,
getTakeCondition(currentPattern),
getIgnoreCondition(currentPattern),
isPatternOptional(currentPattern));
} | Creates a simple single state. For an OPTIONAL state it also consists of a similar state
without the PROCEED edge, so that branches in the computation state graph are created
only once per PROCEED transition.
@param sinkState state that the state being converted should point to
@return the created state | createSingletonState | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
@SuppressWarnings("unchecked")
private State<T> createSingletonState(
final State<T> sinkState,
final State<T> proceedState,
final IterativeCondition<T> takeCondition,
final IterativeCondition<T> ignoreCondition,
final boolean isOptional) {
if (currentPattern instanceof GroupPattern) {
return createGroupPatternState(
(GroupPattern) currentPattern, sinkState, proceedState, isOptional);
}
final State<T> singletonState = createState(State.StateType.Normal, true);
// if event is accepted then all notPatterns previous to the optional states are no
// longer valid
final State<T> sink = copyWithoutTransitiveNots(sinkState);
singletonState.addTake(sink, takeCondition);
// if no element accepted the previous nots are still valid.
final IterativeCondition<T> proceedCondition = getTrueFunction();
if (isOptional) {
if (currentPattern
.getQuantifier()
.hasProperty(Quantifier.QuantifierProperty.GREEDY)) {
final IterativeCondition<T> untilCondition =
(IterativeCondition<T>) currentPattern.getUntilCondition();
if (untilCondition != null) {
singletonState.addProceed(
originalStateMap.get(proceedState.getName()),
new RichAndCondition<>(proceedCondition, untilCondition));
}
singletonState.addProceed(
proceedState,
untilCondition != null
? new RichAndCondition<>(
proceedCondition,
new RichNotCondition<>(untilCondition))
: proceedCondition);
} else {
singletonState.addProceed(proceedState, proceedCondition);
}
}
if (ignoreCondition != null) {
final State<T> ignoreState;
if (isOptional || isHeadOfOptionalGroupPattern(currentPattern)) {
ignoreState = createState(State.StateType.Normal, false);
ignoreState.addTake(sink, takeCondition);
ignoreState.addIgnore(ignoreCondition);
addStopStates(ignoreState);
} else {
ignoreState = singletonState;
}
singletonState.addIgnore(ignoreState, ignoreCondition);
}
return singletonState;
} | Creates a simple single state. For an OPTIONAL state it also consists of a similar state
without the PROCEED edge, so that branches in the computation state graph are created
only once per PROCEED transition.
@param ignoreCondition condition that should be applied to IGNORE transition
@param sinkState state that the state being converted should point to
@param proceedState state that the state being converted should proceed to
@param isOptional whether the state being converted is optional
@return the created state | createSingletonState | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private State<T> createGroupPatternState(
final GroupPattern<T, ?> groupPattern,
final State<T> sinkState,
final State<T> proceedState,
final boolean isOptional) {
final IterativeCondition<T> proceedCondition = getTrueFunction();
Pattern<T, ?> oldCurrentPattern = currentPattern;
Pattern<T, ?> oldFollowingPattern = followingPattern;
GroupPattern<T, ?> oldGroupPattern = currentGroupPattern;
State<T> lastSink = sinkState;
currentGroupPattern = groupPattern;
currentPattern = groupPattern.getRawPattern();
lastSink = createMiddleStates(lastSink);
lastSink = convertPattern(lastSink);
if (isOptional) {
// for the first state of a group pattern, its PROCEED edge should point to
// the following state of that group pattern
lastSink.addProceed(proceedState, proceedCondition);
}
currentPattern = oldCurrentPattern;
followingPattern = oldFollowingPattern;
currentGroupPattern = oldGroupPattern;
return lastSink;
} | Create all the states for the group pattern.
@param groupPattern the group pattern to create the states for
@param sinkState the state that the group pattern being converted should point to
@param proceedState the state that the group pattern being converted should proceed to
@param isOptional whether the group pattern being converted is optional
@return the first state of the states of the group pattern | createGroupPatternState | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private State<T> createLoopingGroupPatternState(
final GroupPattern<T, ?> groupPattern, final State<T> sinkState) {
final IterativeCondition<T> proceedCondition = getTrueFunction();
Pattern<T, ?> oldCurrentPattern = currentPattern;
Pattern<T, ?> oldFollowingPattern = followingPattern;
GroupPattern<T, ?> oldGroupPattern = currentGroupPattern;
final State<T> dummyState = createState(State.StateType.Normal, true);
State<T> lastSink = dummyState;
currentGroupPattern = groupPattern;
currentPattern = groupPattern.getRawPattern();
lastSink = createMiddleStates(lastSink);
lastSink = convertPattern(lastSink);
lastSink.addProceed(sinkState, proceedCondition);
dummyState.addProceed(lastSink, proceedCondition);
currentPattern = oldCurrentPattern;
followingPattern = oldFollowingPattern;
currentGroupPattern = oldGroupPattern;
return lastSink;
} | Create the states for the group pattern as a looping one.
@param groupPattern the group pattern to create the states for
@param sinkState the state that the group pattern being converted should point to
@return the first state of the states of the group pattern | createLoopingGroupPatternState | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
@SuppressWarnings("unchecked")
private State<T> createLooping(final State<T> sinkState) {
if (currentPattern instanceof GroupPattern) {
return createLoopingGroupPatternState((GroupPattern) currentPattern, sinkState);
}
final IterativeCondition<T> untilCondition =
(IterativeCondition<T>) currentPattern.getUntilCondition();
final IterativeCondition<T> ignoreCondition =
extendWithUntilCondition(
getInnerIgnoreCondition(currentPattern), untilCondition, false);
final IterativeCondition<T> takeCondition =
extendWithUntilCondition(
getTakeCondition(currentPattern), untilCondition, true);
IterativeCondition<T> proceedCondition = getTrueFunction();
final State<T> loopingState = createState(State.StateType.Normal, true);
if (currentPattern.getQuantifier().hasProperty(Quantifier.QuantifierProperty.GREEDY)) {
if (untilCondition != null) {
State<T> sinkStateCopy = copy(sinkState);
loopingState.addProceed(
sinkStateCopy,
new RichAndCondition<>(proceedCondition, untilCondition));
originalStateMap.put(sinkState.getName(), sinkStateCopy);
}
loopingState.addProceed(
sinkState,
untilCondition != null
? new RichAndCondition<>(
proceedCondition, new RichNotCondition<>(untilCondition))
: proceedCondition);
updateWithGreedyCondition(sinkState, getTakeCondition(currentPattern));
} else {
loopingState.addProceed(sinkState, proceedCondition);
}
loopingState.addTake(takeCondition);
addStopStateToLooping(loopingState);
if (ignoreCondition != null) {
final State<T> ignoreState = createState(State.StateType.Normal, false);
ignoreState.addTake(loopingState, takeCondition);
ignoreState.addIgnore(ignoreCondition);
loopingState.addIgnore(ignoreState, ignoreCondition);
addStopStateToLooping(ignoreState);
}
return loopingState;
} | Creates the given state as a looping one. A looping state is one with a TAKE edge to itself
and a PROCEED edge to the sinkState. It also consists of a similar state without the
PROCEED edge, so that branches in the computation state graph are created only once per
PROCEED transition.
@param sinkState the state that the converted state should point to
@return the first state of the created complex state | createLooping | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
private IterativeCondition<T> extendWithUntilCondition(
IterativeCondition<T> condition,
IterativeCondition<T> untilCondition,
boolean isTakeCondition) {
if (untilCondition != null && condition != null) {
return new RichAndCondition<>(new RichNotCondition<>(untilCondition), condition);
} else if (untilCondition != null && isTakeCondition) {
return new RichNotCondition<>(untilCondition);
}
return condition;
} | This method extends the given condition with the stop (until) condition if necessary. The
until condition is joined with the given condition when both are non-null; for a TAKE
condition the negated until condition is returned even when the given condition is null.
@param condition the condition to extend
@param untilCondition the until condition to join with the given condition
@param isTakeCondition whether the {@code condition} is for {@code TAKE} edge
@return condition with AND applied or the original condition | extendWithUntilCondition | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
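Outside Flink's `IterativeCondition` hierarchy, the same AND/NOT composition can be sketched with plain `java.util.function.Predicate` (a simplified stand-in; names are illustrative, not Flink API):

```java
import java.util.function.Predicate;

public class UntilExtension {
    // Mirrors extendWithUntilCondition: AND the negated until-condition onto the
    // base condition when both exist; for a TAKE edge, a null base condition
    // still yields NOT(until); otherwise the base condition is returned as-is.
    static <T> Predicate<T> extendWithUntil(
            Predicate<T> condition, Predicate<T> until, boolean isTakeCondition) {
        if (until != null && condition != null) {
            return until.negate().and(condition);
        } else if (until != null && isTakeCondition) {
            return until.negate();
        }
        return condition;
    }
}
```

An element matching the until condition is therefore rejected on every extended edge, which is what makes `until` act as a stop condition.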
@SuppressWarnings("unchecked")
private IterativeCondition<T> getInnerIgnoreCondition(Pattern<T, ?> pattern) {
Quantifier.ConsumingStrategy consumingStrategy =
pattern.getQuantifier().getInnerConsumingStrategy();
if (headOfGroup(pattern)) {
// for the head pattern of a group pattern, we should consider the
// inner consume strategy of the group pattern
consumingStrategy = currentGroupPattern.getQuantifier().getInnerConsumingStrategy();
}
IterativeCondition<T> innerIgnoreCondition = null;
switch (consumingStrategy) {
case STRICT:
innerIgnoreCondition = null;
break;
case SKIP_TILL_NEXT:
innerIgnoreCondition =
new RichNotCondition<>((IterativeCondition<T>) pattern.getCondition());
break;
case SKIP_TILL_ANY:
innerIgnoreCondition = BooleanConditions.trueFunction();
break;
}
if (currentGroupPattern != null && currentGroupPattern.getUntilCondition() != null) {
innerIgnoreCondition =
extendWithUntilCondition(
innerIgnoreCondition,
(IterativeCondition<T>) currentGroupPattern.getUntilCondition(),
false);
}
return innerIgnoreCondition;
} | @return The {@link IterativeCondition condition} for the {@code IGNORE} edge that
corresponds to the specified {@link Pattern} and extended with stop(until) condition
if necessary. It is applicable only for inner states of a complex state like looping
or times. | getInnerIgnoreCondition | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
@SuppressWarnings("unchecked")
private IterativeCondition<T> getIgnoreCondition(Pattern<T, ?> pattern) {
Quantifier.ConsumingStrategy consumingStrategy =
pattern.getQuantifier().getConsumingStrategy();
if (headOfGroup(pattern)) {
// for the head pattern of a group pattern, we should consider the inner consume
// strategy
// of the group pattern if the group pattern is not the head of the TIMES/LOOPING
// quantifier;
// otherwise, we should consider the consume strategy of the group pattern
if (isCurrentGroupPatternFirstOfLoop()) {
consumingStrategy = currentGroupPattern.getQuantifier().getConsumingStrategy();
} else {
consumingStrategy =
currentGroupPattern.getQuantifier().getInnerConsumingStrategy();
}
}
IterativeCondition<T> ignoreCondition = null;
switch (consumingStrategy) {
case STRICT:
ignoreCondition = null;
break;
case SKIP_TILL_NEXT:
ignoreCondition =
new RichNotCondition<>((IterativeCondition<T>) pattern.getCondition());
break;
case SKIP_TILL_ANY:
ignoreCondition = BooleanConditions.trueFunction();
break;
}
if (currentGroupPattern != null && currentGroupPattern.getUntilCondition() != null) {
ignoreCondition =
extendWithUntilCondition(
ignoreCondition,
(IterativeCondition<T>) currentGroupPattern.getUntilCondition(),
false);
}
return ignoreCondition;
} | @return The {@link IterativeCondition condition} for the {@code IGNORE} edge that
corresponds to the specified {@link Pattern} and extended with stop(until) condition
if necessary. For more on strategy see {@link Quantifier} | getIgnoreCondition | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
@SuppressWarnings("unchecked")
private IterativeCondition<T> getTakeCondition(Pattern<T, ?> pattern) {
IterativeCondition<T> takeCondition = (IterativeCondition<T>) pattern.getCondition();
if (currentGroupPattern != null && currentGroupPattern.getUntilCondition() != null) {
takeCondition =
extendWithUntilCondition(
takeCondition,
(IterativeCondition<T>) currentGroupPattern.getUntilCondition(),
true);
}
return takeCondition;
} | @return the {@link IterativeCondition condition} for the {@code TAKE} edge that
corresponds to the specified {@link Pattern} and extended with stop(until) condition
if necessary. | getTakeCondition | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
@Override
public NFA<T> createNFA() {
return new NFA<>(states, windowTimes, windowTime, timeoutHandling);
} | Implementation of the {@link NFAFactory} interface.
<p>The implementation takes the input type serializer, the window time and the set of states
and their transitions to be able to create an NFA from them.
@param <T> Type of the input events which are processed by the NFA | createNFA | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFACompiler.java | Apache-2.0 |
public static String getOriginalNameFromInternal(String internalName) {
Preconditions.checkNotNull(internalName);
return internalName.split(STATE_NAME_DELIM)[0];
} | Implements the reverse process of the {@link #getUniqueInternalName(String)}.
@param internalName The name to be decoded.
@return The original, user-specified name for the state. | getOriginalNameFromInternal | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFAStateNameHandler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFAStateNameHandler.java | Apache-2.0 |
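`getOriginalNameFromInternal` is the inverse of the internal unique-name encoding: it splits on the state-name delimiter and keeps the first token. A minimal encode/decode sketch (the `':'` delimiter and the `uniqueInternalName` helper are assumptions for illustration; Flink's actual `STATE_NAME_DELIM` and encoding may differ):

```java
public class StateNames {
    static final String DELIM = ":"; // assumed delimiter, for illustration only

    // Hypothetical encoder: append an occurrence counter to disambiguate reuse.
    static String uniqueInternalName(String name, int occurrence) {
        return occurrence == 0 ? name : name + DELIM + occurrence;
    }

    // Decoder matching getOriginalNameFromInternal: drop everything after the delimiter.
    static String originalNameFromInternal(String internalName) {
        return internalName.split(DELIM)[0];
    }
}
```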
public void checkNameUniqueness(String name) {
if (usedNames.contains(name)) {
throw new MalformedPatternException(
"Duplicate pattern name: " + name + ". Names must be unique.");
}
usedNames.add(name);
} | Checks if the given name is already used or not. If yes, it throws a {@link
MalformedPatternException}.
@param name The name to be checked. | checkNameUniqueness | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFAStateNameHandler.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/compiler/NFAStateNameHandler.java | Apache-2.0 |
public void lock() {
refCounter += 1;
} | Implements locking logic for an incoming event and {@link SharedBufferNode} using a lock reference
counter. | lock | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/Lockable.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/Lockable.java | Apache-2.0 |
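The reference-counting contract of `Lockable` — `lock()` increments the counter, `release()` decrements it and reports when the element becomes unreferenced and may be removed — can be sketched as follows (a simplified stand-in, not Flink's class):

```java
public class RefCounted<E> {
    private final E element;
    private int refCounter;

    public RefCounted(E element, int initialCount) {
        this.element = element;
        this.refCounter = initialCount;
    }

    public void lock() {
        refCounter += 1;
    }

    // Returns true when the last reference was released and the
    // element may be removed from the shared buffer.
    public boolean release() {
        if (refCounter > 0) {
            refCounter -= 1;
        }
        return refCounter == 0;
    }

    public E getElement() {
        return element;
    }
}
```

This is the mechanism `releaseNode` relies on below: an entry is physically removed only when `release()` reports that no edge or partial match references it anymore.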
void upsertEvent(EventId eventId, Lockable<V> event) {
this.eventsBufferCache.put(eventId, event);
} | Inserts or updates an event in cache.
@param eventId id of the event
@param event event body | upsertEvent | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | Apache-2.0 |
void upsertEntry(NodeId nodeId, Lockable<SharedBufferNode> entry) {
this.entryCache.put(nodeId, entry);
} | Inserts or updates a shareBufferNode in cache.
@param nodeId id of the event
@param entry SharedBufferNode | upsertEntry | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | Apache-2.0 |
void removeEvent(EventId eventId) throws Exception {
this.eventsBufferCache.invalidate(eventId);
this.eventsBuffer.remove(eventId);
} | Removes an event from cache and state.
@param eventId id of the event | removeEvent | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | Apache-2.0 |
void removeEntry(NodeId nodeId) throws Exception {
this.entryCache.invalidate(nodeId);
this.entries.remove(nodeId);
} | Removes a ShareBufferNode from cache and state.
@param nodeId id of the event | removeEntry | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | Apache-2.0 |
Lockable<SharedBufferNode> getEntry(NodeId nodeId) {
try {
Lockable<SharedBufferNode> lockableFromCache = entryCache.getIfPresent(nodeId);
if (Objects.nonNull(lockableFromCache)) {
return lockableFromCache;
} else {
Lockable<SharedBufferNode> lockableFromState = entries.get(nodeId);
if (Objects.nonNull(lockableFromState)) {
entryCache.put(nodeId, lockableFromState);
}
return lockableFromState;
}
} catch (Exception ex) {
throw new WrappingRuntimeException(ex);
}
} | Returns the node, served from the cache when present, otherwise read from state (and then cached).
@param nodeId id of the node
@return SharedBufferNode | getEntry | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | Apache-2.0 |
Lockable<V> getEvent(EventId eventId) {
try {
Lockable<V> lockableFromCache = eventsBufferCache.getIfPresent(eventId);
if (Objects.nonNull(lockableFromCache)) {
return lockableFromCache;
} else {
Lockable<V> lockableFromState = eventsBuffer.get(eventId);
if (Objects.nonNull(lockableFromState)) {
eventsBufferCache.put(eventId, lockableFromState);
}
return lockableFromState;
}
} catch (Exception ex) {
throw new WrappingRuntimeException(ex);
}
} | Returns the event, served from the cache when present, otherwise read from state (and then cached).
@param eventId id of the event
@return event | getEvent | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBuffer.java | Apache-2.0 |
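Both `getEntry` and `getEvent` follow the same read-through pattern: consult the on-heap cache first, fall back to the state backend on a miss, and populate the cache with what was found. A generic sketch, with plain maps standing in for the Guava cache and the keyed state backend (illustrative, not Flink API):

```java
import java.util.HashMap;
import java.util.Map;

public class ReadThrough<K, V> {
    final Map<K, V> cache = new HashMap<>(); // stands in for the Guava cache
    final Map<K, V> state;                   // stands in for the keyed state backend

    public ReadThrough(Map<K, V> state) {
        this.state = state;
    }

    public V get(K key) {
        V cached = cache.get(key);
        if (cached != null) {
            return cached;
        }
        V fromState = state.get(key);
        if (fromState != null) {
            cache.put(key, fromState); // populate the cache on a miss
        }
        return fromState;
    }
}
```

Subsequent lookups of the same key are then served from memory, which is why the `upsert*`/`remove*` methods above must keep cache and state in sync.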
public NodeId put(
final String stateName,
final EventId eventId,
@Nullable final NodeId previousNodeId,
final DeweyNumber version) {
if (previousNodeId != null) {
lockNode(previousNodeId, version);
}
NodeId currentNodeId = new NodeId(eventId, getOriginalNameFromInternal(stateName));
Lockable<SharedBufferNode> currentNode = sharedBuffer.getEntry(currentNodeId);
if (currentNode == null) {
currentNode = new Lockable<>(new SharedBufferNode(), 0);
lockEvent(eventId);
}
currentNode.getElement().addEdge(new SharedBufferEdge(previousNodeId, version));
sharedBuffer.upsertEntry(currentNodeId, currentNode);
return currentNodeId;
} | Stores the given value (value + timestamp) under the given state. It assigns a preceding-element
relation to the previous entry.
@param stateName name of the state that the event should be assigned to
@param eventId unique id of event assigned by this SharedBuffer
@param previousNodeId id of previous entry (might be null if start of new run)
@param version Version of the previous relation
@return assigned id of this element | put | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | Apache-2.0 |
public List<Map<String, List<EventId>>> extractPatterns(
final NodeId nodeId, final DeweyNumber version) {
List<Map<String, List<EventId>>> result = new ArrayList<>();
// stack to remember the current extraction states
Stack<SharedBufferAccessor.ExtractionState> extractionStates = new Stack<>();
// get the starting shared buffer entry for the previous relation
Lockable<SharedBufferNode> entryLock = sharedBuffer.getEntry(nodeId);
if (entryLock != null) {
SharedBufferNode entry = entryLock.getElement();
extractionStates.add(
new SharedBufferAccessor.ExtractionState(
Tuple2.of(nodeId, entry), version, new Stack<>()));
// use a depth first search to reconstruct the previous relations
while (!extractionStates.isEmpty()) {
final SharedBufferAccessor.ExtractionState extractionState = extractionStates.pop();
// current path of the depth first search
final Stack<Tuple2<NodeId, SharedBufferNode>> currentPath =
extractionState.getPath();
final Tuple2<NodeId, SharedBufferNode> currentEntry = extractionState.getEntry();
// termination criterion
if (currentEntry == null) {
final Map<String, List<EventId>> completePath = new LinkedHashMap<>();
while (!currentPath.isEmpty()) {
final NodeId currentPathEntry = currentPath.pop().f0;
String page = currentPathEntry.getPageName();
List<EventId> values =
completePath.computeIfAbsent(page, k -> new ArrayList<>());
values.add(currentPathEntry.getEventId());
}
result.add(completePath);
} else {
// append state to the path
currentPath.push(currentEntry);
boolean firstMatch = true;
for (Lockable<SharedBufferEdge> lockableEdge : currentEntry.f1.getEdges()) {
// we can only proceed if the current version is compatible to the version
// of this previous relation
final SharedBufferEdge edge = lockableEdge.getElement();
final DeweyNumber currentVersion = extractionState.getVersion();
if (currentVersion.isCompatibleWith(edge.getDeweyNumber())) {
final NodeId target = edge.getTarget();
Stack<Tuple2<NodeId, SharedBufferNode>> newPath;
if (firstMatch) {
// for the first match we don't have to copy the current path
newPath = currentPath;
firstMatch = false;
} else {
newPath = new Stack<>();
newPath.addAll(currentPath);
}
extractionStates.push(
new SharedBufferAccessor.ExtractionState(
target != null
? Tuple2.of(
target,
sharedBuffer
.getEntry(target)
.getElement())
: null,
edge.getDeweyNumber(),
newPath));
}
}
}
}
}
return result;
} | Returns all elements from the previous relation starting at the given entry.
@param nodeId id of the starting entry
@param version Version of the previous relation which shall be extracted
@return Collection of previous relations starting with the given value | extractPatterns | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | Apache-2.0 |
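`extractPatterns` walks the predecessor edges depth-first with an explicit stack, reusing the current path for the first compatible edge and copying it for every additional branch. The traversal shape — minus the Dewey-number compatibility filtering — can be sketched on a plain predecessor map (illustrative, not Flink API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Stack;

public class PathExtractor {
    // Collects every path from `start` back to a node with no predecessors,
    // using explicit stacks of (node, path-so-far) frames as in extractPatterns.
    static List<List<String>> extract(Map<String, List<String>> predecessors, String start) {
        List<List<String>> result = new ArrayList<>();
        Stack<String> nodes = new Stack<>();
        Stack<List<String>> paths = new Stack<>();
        nodes.push(start);
        paths.push(new ArrayList<>());
        while (!nodes.isEmpty()) {
            String current = nodes.pop();
            List<String> path = paths.pop();
            path.add(current);
            List<String> prev = predecessors.getOrDefault(current, List.of());
            if (prev.isEmpty()) {
                result.add(path); // reached a start-of-run node: the path is complete
            } else {
                boolean first = true;
                for (String p : prev) {
                    // reuse the current path for the first branch, copy for the rest
                    List<String> next = first ? path : new ArrayList<>(path);
                    first = false;
                    nodes.push(p);
                    paths.push(next);
                }
            }
        }
        return result;
    }
}
```

The copy-on-branch trick is the same one Flink applies with `firstMatch`: shared prefixes are materialized only when a node actually has more than one compatible outgoing edge.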
public Map<String, List<V>> materializeMatch(Map<String, List<EventId>> match) {
Map<String, List<V>> materializedMatch =
CollectionUtil.newLinkedHashMapWithExpectedSize(match.size());
for (Map.Entry<String, List<EventId>> pattern : match.entrySet()) {
List<V> events = new ArrayList<>(pattern.getValue().size());
for (EventId eventId : pattern.getValue()) {
try {
V event = sharedBuffer.getEvent(eventId).getElement();
events.add(event);
} catch (Exception ex) {
throw new WrappingRuntimeException(ex);
}
}
materializedMatch.put(pattern.getKey(), events);
}
return materializedMatch;
} | Extracts the real event from the sharedBuffer with pre-extracted eventId.
@param match the matched event's eventId.
@return the event associated with the eventId. | materializeMatch | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | Apache-2.0 |
public void lockNode(final NodeId node, final DeweyNumber version) {
Lockable<SharedBufferNode> sharedBufferNode = sharedBuffer.getEntry(node);
if (sharedBufferNode != null) {
sharedBufferNode.lock();
for (Lockable<SharedBufferEdge> edge : sharedBufferNode.getElement().getEdges()) {
if (version.isCompatibleWith(edge.getElement().getDeweyNumber())) {
edge.lock();
}
}
sharedBuffer.upsertEntry(node, sharedBufferNode);
}
} | Increases the reference counter for the given entry so that it is not accidentally removed.
@param node id of the entry
@param version dewey number of the (potential) edge that locks the given node | lockNode | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | Apache-2.0 |
public void releaseNode(final NodeId node, final DeweyNumber version) throws Exception {
// the stack used to detect all nodes that need to be released.
Stack<NodeId> nodesToExamine = new Stack<>();
Stack<DeweyNumber> versionsToExamine = new Stack<>();
nodesToExamine.push(node);
versionsToExamine.push(version);
while (!nodesToExamine.isEmpty()) {
NodeId curNode = nodesToExamine.pop();
Lockable<SharedBufferNode> curBufferNode = sharedBuffer.getEntry(curNode);
if (curBufferNode == null) {
break;
}
DeweyNumber currentVersion = versionsToExamine.pop();
List<Lockable<SharedBufferEdge>> edges = curBufferNode.getElement().getEdges();
Iterator<Lockable<SharedBufferEdge>> edgesIterator = edges.iterator();
while (edgesIterator.hasNext()) {
Lockable<SharedBufferEdge> sharedBufferEdge = edgesIterator.next();
SharedBufferEdge edge = sharedBufferEdge.getElement();
if (currentVersion.isCompatibleWith(edge.getDeweyNumber())) {
if (sharedBufferEdge.release()) {
edgesIterator.remove();
NodeId targetId = edge.getTarget();
if (targetId != null) {
nodesToExamine.push(targetId);
versionsToExamine.push(edge.getDeweyNumber());
}
}
}
}
if (curBufferNode.release()) {
// first release the current node
sharedBuffer.removeEntry(curNode);
releaseEvent(curNode.getEventId());
} else {
sharedBuffer.upsertEntry(curNode, curBufferNode);
}
}
} | Decreases the reference counter for the given entry so that it can be removed once the
reference counter reaches 0.
@param node id of the entry
@param version dewey number of the (potential) edge that locked the given node
@throws Exception Thrown if the system cannot access the state. | releaseNode | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | Apache-2.0 |
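The release walk above avoids recursion by pushing follow-up nodes onto a stack. A self-contained sketch of the same reference-counting traversal over a toy predecessor graph (names are assumed stand-ins, not Flink's classes):

```java
import java.util.*;

// Sketch of the stack-based release in releaseNode: decrement a node's
// reference count and, when it drops to zero, remove the node and push its
// predecessor so the same treatment cascades backwards through the graph.
class RefGraph {
    final Map<String, Integer> refCount = new HashMap<>();
    final Map<String, String> target = new HashMap<>(); // edge: node -> predecessor

    void release(String node) {
        Deque<String> toExamine = new ArrayDeque<>();
        toExamine.push(node);
        while (!toExamine.isEmpty()) {
            String cur = toExamine.pop();
            Integer count = refCount.get(cur);
            if (count == null) continue;          // already removed
            if (count > 1) {
                refCount.put(cur, count - 1);     // still referenced elsewhere
            } else {
                refCount.remove(cur);             // last reference gone
                String next = target.remove(cur);
                if (next != null) toExamine.push(next);
            }
        }
    }
}
```

Unlike the real implementation, this sketch skips edge-level counters and Dewey-number checks; it shows only the cascading-release shape.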
private void lockEvent(EventId eventId) {
Lockable<V> eventWrapper = sharedBuffer.getEvent(eventId);
checkState(eventWrapper != null, "Referring to non-existent event with id %s", eventId);
eventWrapper.lock();
sharedBuffer.upsertEvent(eventId, eventWrapper);
} | Increases the reference counter for the given event so that it is not accidentally removed.
@param eventId id of the entry | lockEvent | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | Apache-2.0 |
public void releaseEvent(EventId eventId) throws Exception {
Lockable<V> eventWrapper = sharedBuffer.getEvent(eventId);
if (eventWrapper != null) {
if (eventWrapper.release()) {
sharedBuffer.removeEvent(eventId);
} else {
sharedBuffer.upsertEvent(eventId, eventWrapper);
}
}
} | Decreases the reference counter for the given event so that it can be removed once the
reference counter reaches 0.
@param eventId id of the event
@throws Exception Thrown if the system cannot access the state. | releaseEvent | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferAccessor.java | Apache-2.0 |
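`lockEvent` / `releaseEvent` follow the classic reference-counting wrapper pattern (Flink's `Lockable` works along these lines; the class below is an illustrative stand-in):

```java
// Minimal reference-counted wrapper: lock() pins the element, release()
// unpins it and reports when the last reference is gone and the element
// may be removed from the backing store.
class Counted<V> {
    final V element;
    private int refs;

    Counted(V element) { this.element = element; }

    void lock() { refs++; }

    boolean release() { return --refs <= 0; } // true: safe to remove
}
```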
private void processEvent(NFAState nfaState, IN event, long timestamp) throws Exception {
try (SharedBufferAccessor<IN> sharedBufferAccessor = partialMatches.getAccessor()) {
Collection<Map<String, List<IN>>> patterns =
nfa.process(
sharedBufferAccessor,
nfaState,
event,
timestamp,
afterMatchSkipStrategy,
cepTimerService);
if (nfa.getWindowTime() > 0 && nfaState.isNewStartPartialMatch()) {
registerTimer(timestamp + nfa.getWindowTime());
}
processMatchedSequences(patterns, timestamp);
}
} | Process the given event by giving it to the NFA and outputting the produced set of matched
event sequences.
@param nfaState Our NFAState object
@param event The current event to be processed
@param timestamp The timestamp of the event | processEvent | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/operator/CepOperator.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/operator/CepOperator.java | Apache-2.0 |
@Override
public long currentProcessingTime() {
return timerService.currentProcessingTime();
} | Gives {@link NFA} access to {@link InternalTimerService} and tells if {@link CepOperator}
works in processing time. Should be instantiated once per operator. | currentProcessingTime | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/operator/CepOperator.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/operator/CepOperator.java | Apache-2.0 |
@Override
public int compare(StreamRecord<IN> o1, StreamRecord<IN> o2) {
if (o1.getTimestamp() < o2.getTimestamp()) {
return -1;
} else if (o1.getTimestamp() > o2.getTimestamp()) {
return 1;
} else {
return 0;
}
} | Compares two {@link StreamRecord}s based on their timestamp.
@param <IN> Type of the value field of the StreamRecord | compare | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/operator/StreamRecordComparator.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/operator/StreamRecordComparator.java | Apache-2.0 |
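The hand-rolled three-way comparison above is equivalent to `Long.compare` / `Comparator.comparingLong`. A self-contained equivalent (a `record` stands in for `StreamRecord` here):

```java
import java.util.*;

// Ordering stream records by timestamp without spelling out the
// less-than / greater-than branches by hand.
record Rec(String value, long timestamp) {}

class TimestampOrder {
    static final Comparator<Rec> BY_TIMESTAMP =
            Comparator.comparingLong(Rec::timestamp);
}
```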
public static <X> Pattern<X, X> begin(final String name) {
return new Pattern<>(name, null, ConsumingStrategy.STRICT, AfterMatchSkipStrategy.noSkip());
} | Starts a new pattern sequence. The provided name is the one of the initial pattern of the new
sequence. Furthermore, the base type of the event sequence is set.
@param name The name of starting pattern of the new pattern sequence
@param <X> Base type of the event pattern
@return The first pattern of a pattern sequence | begin | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public static <X> Pattern<X, X> begin(
final String name, final AfterMatchSkipStrategy afterMatchSkipStrategy) {
return new Pattern<X, X>(name, null, ConsumingStrategy.STRICT, afterMatchSkipStrategy);
} | Starts a new pattern sequence. The provided name is the one of the initial pattern of the new
sequence. Furthermore, the base type of the event sequence is set.
@param name The name of starting pattern of the new pattern sequence
@param afterMatchSkipStrategy the {@link AfterMatchSkipStrategy.SkipStrategy} to use after
each match.
@param <X> Base type of the event pattern
@return The first pattern of a pattern sequence | begin | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> where(IterativeCondition<F> condition) {
Preconditions.checkNotNull(condition, "The condition cannot be null.");
ClosureCleaner.clean(condition, ExecutionConfig.ClosureCleanerLevel.RECURSIVE, true);
if (this.condition == null) {
this.condition = condition;
} else {
this.condition = new RichAndCondition<>(this.condition, condition);
}
return this;
} | Adds a condition that has to be satisfied by an event in order to be considered a match. If
another condition has already been set, the new one is going to be combined with the previous
with a logical {@code AND}. Otherwise, this is going to be the only condition.
@param condition The condition as an {@link IterativeCondition}.
@return The pattern with the new condition is set. | where | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> or(IterativeCondition<F> condition) {
Preconditions.checkNotNull(condition, "The condition cannot be null.");
ClosureCleaner.clean(condition, ExecutionConfig.ClosureCleanerLevel.RECURSIVE, true);
if (this.condition == null) {
this.condition = condition;
} else {
this.condition = new RichOrCondition<>(this.condition, condition);
}
return this;
} | Adds a condition that has to be satisfied by an event in order to be considered a match. If
another condition has already been set, the new one is going to be combined with the previous
with a logical {@code OR}. Otherwise, this is going to be the only condition.
@param condition The condition as an {@link IterativeCondition}.
@return The pattern with the new condition is set. | or | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public <S extends F> Pattern<T, S> subtype(final Class<S> subtypeClass) {
Preconditions.checkNotNull(subtypeClass, "The class cannot be null.");
if (condition == null) {
this.condition = new SubtypeCondition<F>(subtypeClass);
} else {
this.condition =
new RichAndCondition<>(condition, new SubtypeCondition<F>(subtypeClass));
}
@SuppressWarnings("unchecked")
Pattern<T, S> result = (Pattern<T, S>) this;
return result;
} | Applies a subtype constraint on the current pattern. This means that an event has to be of
the given subtype in order to be matched.
@param subtypeClass Class of the subtype
@param <S> Type of the subtype
@return The same pattern with the new subtype constraint | subtype | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> until(IterativeCondition<F> untilCondition) {
Preconditions.checkNotNull(untilCondition, "The condition cannot be null");
if (this.untilCondition != null) {
throw new MalformedPatternException("Only one until condition can be applied.");
}
if (!quantifier.hasProperty(Quantifier.QuantifierProperty.LOOPING)) {
throw new MalformedPatternException(
"The until condition is only applicable to looping states.");
}
ClosureCleaner.clean(untilCondition, ExecutionConfig.ClosureCleanerLevel.RECURSIVE, true);
this.untilCondition = untilCondition;
return this;
} | Applies a stop condition for a looping state. It allows cleaning the underlying state.
@param untilCondition a condition an event has to satisfy to stop collecting events into the
looping state
@return The same pattern with applied untilCondition | until | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> within(@Nullable Duration windowTime) {
return within(windowTime, WithinType.FIRST_AND_LAST);
} | Defines the maximum time interval in which a matching pattern has to be completed in order to
be considered valid. This interval corresponds to the maximum time gap between first and the
last event.
@param windowTime Time of the matching window
@return The same pattern operator with the new window length | within | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
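The `FIRST_AND_LAST` semantics boil down to a single gap check over a candidate sequence. A sketch of that check (timestamps and window in milliseconds; an assumed simplification of the pruning the NFA performs incrementally):

```java
// A candidate sequence satisfies within(window) under FIRST_AND_LAST only if
// the gap between its first and last timestamps does not exceed the window.
class WindowCheck {
    static boolean withinFirstAndLast(long[] timestamps, long windowMillis) {
        if (timestamps.length == 0) return true;
        return timestamps[timestamps.length - 1] - timestamps[0] <= windowMillis;
    }
}
```

The two assertions below mirror the NFAITCase scenario later in this section: with a 10 ms window, a sequence spanning timestamps 2..11 matches while one spanning 2..13 does not.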
public Pattern<T, F> within(@Nullable Duration windowTime, WithinType withinType) {
if (windowTime != null) {
windowTimes.put(withinType, windowTime);
}
return this;
} | Defines the maximum time interval in which a matching pattern has to be completed in order to
be considered valid. This interval corresponds to the maximum time gap between events.
@param withinType Type of the within interval between events
@param windowTime Time of the matching window
@return The same pattern operator with the new window length | within | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, T> next(final String name) {
return new Pattern<>(name, this, ConsumingStrategy.STRICT, afterMatchSkipStrategy);
} | Appends a new pattern to the existing one. The new pattern enforces strict temporal
contiguity. This means that the whole pattern sequence matches only if an event which matches
this pattern directly follows the preceding matching event. Thus, there cannot be any events
in between two matching events.
@param name Name of the new pattern
@return A new pattern which is appended to this one | next | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, T> notNext(final String name) {
if (quantifier.hasProperty(Quantifier.QuantifierProperty.OPTIONAL)) {
throw new UnsupportedOperationException(
"Specifying a pattern with an optional path to NOT condition is not supported yet. "
+ "You can simulate such pattern with two independent patterns, one with and the other without "
+ "the optional part.");
}
return new Pattern<>(name, this, ConsumingStrategy.NOT_NEXT, afterMatchSkipStrategy);
} | Appends a new pattern to the existing one. The new pattern enforces that there is no event
matching this pattern right after the preceding matched event.
@param name Name of the new pattern
@return A new pattern which is appended to this one | notNext | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, T> notFollowedBy(final String name) {
if (quantifier.hasProperty(Quantifier.QuantifierProperty.OPTIONAL)) {
throw new UnsupportedOperationException(
"Specifying a pattern with an optional path to NOT condition is not supported yet. "
+ "You can simulate such pattern with two independent patterns, one with and the other without "
+ "the optional part.");
}
return new Pattern<>(name, this, ConsumingStrategy.NOT_FOLLOW, afterMatchSkipStrategy);
} | Appends a new pattern to the existing one. The new pattern enforces that there is no event
matching this pattern between the preceding pattern and succeeding this one.
<p><b>NOTE:</b> There has to be other pattern after this one.
@param name Name of the new pattern
@return A new pattern which is appended to this one | notFollowedBy | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> optional() {
checkIfPreviousPatternGreedy();
quantifier.optional();
return this;
} | Specifies that this pattern is optional for a final match of the pattern sequence to happen.
@return The same pattern as optional.
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | optional | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> oneOrMore() {
return oneOrMore((Duration) null);
Specifies that this pattern can occur {@code one or more} times. This means at least one and
at most an infinite number of events can be matched to this pattern.
<p>If this quantifier is enabled for a pattern {@code A.oneOrMore().followedBy(B)} and a
sequence of events {@code A1 A2 B} appears, this will generate patterns: {@code A1 B} and
{@code A1 A2 B}. See also {@link #allowCombinations()}.
@return The same pattern with a {@link Quantifier#looping(ConsumingStrategy)} quantifier
applied.
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | oneOrMore | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> oneOrMore(@Nullable Duration windowTime) {
checkIfNoNotPattern();
checkIfQuantifierApplied();
this.quantifier = Quantifier.looping(quantifier.getConsumingStrategy());
this.times = Times.of(1, windowTime);
return this;
Specifies that this pattern can occur {@code one or more} times; the time interval corresponds
to the maximum time gap between the previous and current event for each occurrence. This means
at least one and at most an infinite number of events can be matched to this pattern.
<p>If this quantifier is enabled for a pattern {@code A.oneOrMore().followedBy(B)} and a
sequence of events {@code A1 A2 B} appears, this will generate patterns: {@code A1 B} and
{@code A1 A2 B}. See also {@link #allowCombinations()}.
@param windowTime time of the matching window between times
@return The same pattern with a {@link Quantifier#looping(ConsumingStrategy)} quantifier
applied.
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | oneOrMore | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> greedy() {
checkIfNoNotPattern();
checkIfNoGroupPattern();
this.quantifier.greedy();
return this;
} | Specifies that this pattern is greedy. This means as many events as possible will be matched
to this pattern.
@return The same pattern with {@link Quantifier#greedy} set to true.
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | greedy | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> times(int times) {
return times(times, null);
Specifies the exact number of times that this pattern should be matched.
@param times number of times matching event must appear
@return The same pattern with number of times applied
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | times | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> times(int times, @Nullable Duration windowTime) {
checkIfNoNotPattern();
checkIfQuantifierApplied();
Preconditions.checkArgument(times > 0, "You should give a positive number greater than 0.");
this.quantifier = Quantifier.times(quantifier.getConsumingStrategy());
this.times = Times.of(times, windowTime);
return this;
Specifies the exact number of times that this pattern should be matched; the time interval
corresponds to the maximum time gap between the previous and current event for each occurrence.
@param times number of times matching event must appear
@param windowTime time of the matching window between times
@return The same pattern with number of times applied
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | times | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> times(int from, int to) {
return times(from, to, null);
} | Specifies that the pattern can occur between from and to times.
@param from number of times matching event must appear at least
@param to number of times matching event must appear at most
@return The same pattern with the number of times range applied
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | times | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> times(int from, int to, @Nullable Duration windowTime) {
checkIfNoNotPattern();
checkIfQuantifierApplied();
this.quantifier = Quantifier.times(quantifier.getConsumingStrategy());
if (from == 0) {
this.quantifier.optional();
from = 1;
}
this.times = Times.of(from, to, windowTime);
return this;
Specifies that the pattern can occur between from and to times, with a time interval that
corresponds to the maximum time gap between the previous and current event for each occurrence.
@param from number of times matching event must appear at least
@param to number of times matching event must appear at most
@param windowTime time of the matching window between times
@return The same pattern with the number of times range applied
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | times | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> timesOrMore(int times) {
return timesOrMore(times, null);
Specifies that this pattern can occur at least the specified number of times. This means at
least the specified number and at most an infinite number of events can be matched to this pattern.
@return The same pattern with a {@link Quantifier#looping(ConsumingStrategy)} quantifier
applied.
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | timesOrMore | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> timesOrMore(int times, @Nullable Duration windowTime) {
checkIfNoNotPattern();
checkIfQuantifierApplied();
this.quantifier = Quantifier.looping(quantifier.getConsumingStrategy());
this.times = Times.of(times, windowTime);
return this;
Specifies that this pattern can occur at least the specified number of times, with an interval
that corresponds to the maximum time gap between the previous and current event for each
occurrence. This means at least the specified number and at most an infinite number of events
can be matched to this pattern.
@param times number of times at least matching event must appear
@param windowTime time of the matching window between times
@return The same pattern with a {@link Quantifier#looping(ConsumingStrategy)} quantifier
applied.
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | timesOrMore | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public Pattern<T, F> allowCombinations() {
quantifier.combinations();
return this;
} | Applicable only to {@link Quantifier#looping(ConsumingStrategy)} and {@link
Quantifier#times(ConsumingStrategy)} patterns, this option allows more flexibility to the
matching events.
<p>If {@code allowCombinations()} is not applied for a pattern {@code
A.oneOrMore().followedBy(B)} and a sequence of events {@code A1 A2 B} appears, this will
generate patterns: {@code A1 B} and {@code A1 A2 B}. If this method is applied, we will have
{@code A1 B}, {@code A2 B} and {@code A1 A2 B}.
@return The same pattern with the updated quantifier.
@throws MalformedPatternException if the quantifier is not applicable to this pattern. | allowCombinations | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
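The difference the docstring describes — `{A1 B}`, `{A1 A2 B}` without `allowCombinations()`, plus `{A2 B}` with it — can be sketched as prefixes versus all order-preserving subsets of the looping run (an illustrative simplification, not the NFA's actual enumeration):

```java
import java.util.*;

// Match sets for a looping pattern over events [A1, A2]: without
// allowCombinations() only prefixes of the run are produced; with it,
// every non-empty order-preserving subset is.
class LoopingMatches {
    static List<List<String>> prefixes(List<String> events) {
        List<List<String>> out = new ArrayList<>();
        for (int end = 1; end <= events.size(); end++) {
            out.add(new ArrayList<>(events.subList(0, end)));
        }
        return out;
    }

    static List<List<String>> combinations(List<String> events) {
        List<List<String>> out = new ArrayList<>();
        int n = events.size();
        for (int mask = 1; mask < (1 << n); mask++) { // every non-empty subset
            List<String> pick = new ArrayList<>();
            for (int i = 0; i < n; i++) {
                if ((mask & (1 << i)) != 0) pick.add(events.get(i));
            }
            out.add(pick);
        }
        return out;
    }
}
```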
public Pattern<T, F> consecutive() {
quantifier.consecutive();
return this;
} | Works in conjunction with {@link Pattern#oneOrMore()} or {@link Pattern#times(int)}.
Specifies that any not matching element breaks the loop.
<p>E.g. a pattern like:
<pre>{@code
Pattern.<Event>begin("start").where(new SimpleCondition<Event>() {
@Override
public boolean filter(Event value) throws Exception {
return value.getName().equals("c");
}
})
.followedBy("middle").where(new SimpleCondition<Event>() {
@Override
public boolean filter(Event value) throws Exception {
return value.getName().equals("a");
}
}).oneOrMore().consecutive()
.followedBy("end1").where(new SimpleCondition<Event>() {
@Override
public boolean filter(Event value) throws Exception {
return value.getName().equals("b");
}
});
}</pre>
<p>for a sequence: C D A1 A2 A3 D A4 B
<p>will generate matches: {C A1 B}, {C A1 A2 B}, {C A1 A2 A3 B}
<p>By default a relaxed continuity is applied.
@return pattern with continuity changed to strict | consecutive | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public static <T, F extends T> GroupPattern<T, F> begin(
final Pattern<T, F> group, final AfterMatchSkipStrategy afterMatchSkipStrategy) {
return new GroupPattern<>(null, group, ConsumingStrategy.STRICT, afterMatchSkipStrategy);
} | Starts a new pattern sequence. The provided pattern is the initial pattern of the new
sequence.
@param group the pattern to begin with
@param afterMatchSkipStrategy the {@link AfterMatchSkipStrategy.SkipStrategy} to use after
each match.
@return The first pattern of a pattern sequence | begin | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public static <T, F extends T> GroupPattern<T, F> begin(Pattern<T, F> group) {
return new GroupPattern<>(
null, group, ConsumingStrategy.STRICT, AfterMatchSkipStrategy.noSkip());
} | Starts a new pattern sequence. The provided pattern is the initial pattern of the new
sequence.
@param group the pattern to begin with
@return the first pattern of a pattern sequence | begin | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/Pattern.java | Apache-2.0 |
public static <T> IterativeCondition<T> trueFunction() {
return SimpleCondition.of(value -> true);
} | @return An {@link IterativeCondition} that always returns {@code true}. | trueFunction | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/conditions/BooleanConditions.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/conditions/BooleanConditions.java | Apache-2.0 |
public static <T> IterativeCondition<T> falseFunction() {
return SimpleCondition.of(value -> false);
} | @return An {@link IterativeCondition} that always returns {@code false}. | falseFunction | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/conditions/BooleanConditions.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/conditions/BooleanConditions.java | Apache-2.0 |
@Override
public boolean filter(T value, Context<T> ctx) throws Exception {
return getLeft().filter(value, ctx) && getRight().filter(value, ctx);
} | A {@link RichIterativeCondition condition} which combines two conditions with a logical {@code
AND} and returns {@code true} if both are {@code true}.
@param <T> Type of the element to filter | filter | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/conditions/RichAndCondition.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/conditions/RichAndCondition.java | Apache-2.0 |
@Override
public boolean filter(T value, Context<T> ctx) throws Exception {
return !getNestedConditions()[0].filter(value, ctx);
} | A {@link RichIterativeCondition condition} which negates the condition it wraps and returns
{@code true} if the original condition returns {@code false}.
@param <T> Type of the element to filter | filter | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/conditions/RichNotCondition.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/conditions/RichNotCondition.java | Apache-2.0 |
@Override
public boolean filter(T value, Context<T> ctx) throws Exception {
return getLeft().filter(value, ctx) || getRight().filter(value, ctx);
} | A {@link RichIterativeCondition condition} which combines two conditions with a logical {@code
OR} and returns {@code true} if at least one is {@code true}.
@param <T> Type of the element to filter | filter | java | apache/flink | flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/conditions/RichOrCondition.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/main/java/org/apache/flink/cep/pattern/conditions/RichOrCondition.java | Apache-2.0 |
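`RichAndCondition`, `RichOrCondition`, and `RichNotCondition` mirror ordinary predicate combinators; `java.util.function.Predicate` shows the same composition without any Flink dependency (the concrete predicates below are made up for the example):

```java
import java.util.function.Predicate;

// Composing simple predicates the way the Rich*Condition classes compose
// iterative conditions: AND, OR, and negation.
class Conditions {
    static final Predicate<String> IS_START = v -> v.startsWith("start");
    static final Predicate<String> IS_LONG  = v -> v.length() > 5;

    static final Predicate<String> AND = IS_START.and(IS_LONG);  // RichAndCondition
    static final Predicate<String> OR  = IS_START.or(IS_LONG);   // RichOrCondition
    static final Predicate<String> NOT = IS_START.negate();      // RichNotCondition
}
```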
@Parameterized.Parameters
public static Collection<Configuration> prepareSharedBufferCacheConfig() {
Configuration miniCacheConfig = new Configuration();
miniCacheConfig.set(CEPCacheOptions.CEP_CACHE_STATISTICS_INTERVAL, Duration.ofSeconds(1));
miniCacheConfig.set(CEPCacheOptions.CEP_SHARED_BUFFER_ENTRY_CACHE_SLOTS, 1);
miniCacheConfig.set(CEPCacheOptions.CEP_SHARED_BUFFER_EVENT_CACHE_SLOTS, 1);
Configuration bigCacheConfig = new Configuration();
bigCacheConfig.set(CEPCacheOptions.CEP_CACHE_STATISTICS_INTERVAL, Duration.ofSeconds(1));
return Arrays.asList(miniCacheConfig, bigCacheConfig);
} | End to end tests of both CEP operators and {@link NFA}. | prepareSharedBufferCacheConfig | java | apache/flink | flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/CEPITCase.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/CEPITCase.java | Apache-2.0 |
@Test
public void testSimplePatternWithTimeWindowNFAWithinFirstAndLast() throws Exception {
List<StreamRecord<Event>> events = new ArrayList<>();
final Event startEvent;
final Event middleEvent;
final Event endEvent;
events.add(new StreamRecord<>(new Event(1, "start", 1.0), 1));
events.add(new StreamRecord<>(startEvent = new Event(2, "start", 1.0), 2));
events.add(new StreamRecord<>(middleEvent = new Event(3, "middle", 1.0), 3));
events.add(new StreamRecord<>(new Event(4, "foobar", 1.0), 4));
events.add(new StreamRecord<>(endEvent = new Event(5, "end", 1.0), 11));
events.add(new StreamRecord<>(new Event(6, "end", 1.0), 13));
Pattern<Event, ?> pattern =
Pattern.<Event>begin("start")
.where(SimpleCondition.of(value -> value.getName().equals("start")))
.followedBy("middle")
.where(SimpleCondition.of(value -> value.getName().equals("middle")))
.followedBy("end")
.where(SimpleCondition.of(value -> value.getName().equals("end")))
.within(Duration.ofMillis(10));
NFA<Event> nfa = compile(pattern, false);
List<List<Event>> resultingPatterns = feedNFA(events, nfa);
comparePatterns(
resultingPatterns,
Lists.<List<Event>>newArrayList(
Lists.newArrayList(startEvent, middleEvent, endEvent)));
} | Tests that the NFA successfully filters out expired elements with respect to the window
length between the first and last event. | testSimplePatternWithTimeWindowNFAWithinFirstAndLast | java | apache/flink | flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/NFAITCase.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/NFAITCase.java | Apache-2.0 |
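The expiry rule the test above exercises can be sketched as a standalone predicate (a minimal sketch of the semantics, not Flink API; the exclusive upper bound follows from the test data, where the t=1 start does not match the t=11 end with a 10 ms window):

```java
public class WindowExpirySketch {

    // FIRST_AND_LAST semantics: the window is anchored at the first event of a
    // partial match, and the upper bound is exclusive, so a completing event
    // at time t is accepted only while t - firstTimestamp < windowLength.
    static boolean withinWindow(long firstTimestamp, long candidateTimestamp, long windowLength) {
        return candidateTimestamp - firstTimestamp < windowLength;
    }

    public static void main(String[] args) {
        long window = 10L;
        // Mirrors the test data above: starts at t=1 and t=2, end event at t=11.
        System.out.println(withinWindow(1L, 11L, window)); // false -> the t=1 start expires
        System.out.println(withinWindow(2L, 11L, window)); // true  -> the t=2 start matches
    }
}
```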
@Test
public void testSimplePatternWithTimeWindowNFAWithinPreviousAndCurrent() throws Exception {
List<StreamRecord<Event>> events = new ArrayList<>();
final Event startEvent1;
final Event startEvent2;
final Event middleEvent;
final Event endEvent;
events.add(new StreamRecord<>(startEvent1 = new Event(1, "start", 1.0), 1));
events.add(new StreamRecord<>(startEvent2 = new Event(2, "start", 1.0), 2));
events.add(new StreamRecord<>(middleEvent = new Event(3, "middle", 1.0), 3));
events.add(new StreamRecord<>(new Event(4, "foobar", 1.0), 4));
events.add(new StreamRecord<>(endEvent = new Event(5, "end", 1.0), 11));
events.add(new StreamRecord<>(new Event(6, "end", 1.0), 13));
Pattern<Event, ?> pattern =
Pattern.<Event>begin("start")
.where(SimpleCondition.of(value -> value.getName().equals("start")))
.followedBy("middle")
.where(SimpleCondition.of(value -> value.getName().equals("middle")))
.followedBy("end")
.where(SimpleCondition.of(value -> value.getName().equals("end")))
.within(Duration.ofMillis(9), WithinType.PREVIOUS_AND_CURRENT);
NFA<Event> nfa = compile(pattern, false);
List<List<Event>> resultingPatterns = feedNFA(events, nfa);
comparePatterns(
resultingPatterns,
Lists.<List<Event>>newArrayList(
Lists.newArrayList(startEvent1, middleEvent, endEvent),
Lists.newArrayList(startEvent2, middleEvent, endEvent)));
} | Tests that the NFA successfully filters out expired elements with respect to the window
length between the previous and current event. | testSimplePatternWithTimeWindowNFAWithinPreviousAndCurrent | java | apache/flink | flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/NFAITCase.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/NFAITCase.java | Apache-2.0 |
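The difference from the FIRST_AND_LAST test is which gap the window constrains. A sketch of the PREVIOUS_AND_CURRENT check (illustrative only; the exclusive bound is an assumption carried over from the FIRST_AND_LAST test data):

```java
public class PerStepWindowSketch {

    // PREVIOUS_AND_CURRENT semantics: the window constrains each consecutive
    // pair of events in a match rather than the whole match span.
    static boolean gapsWithinWindow(long[] timestamps, long windowLength) {
        for (int i = 1; i < timestamps.length; i++) {
            if (timestamps[i] - timestamps[i - 1] >= windowLength) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Mirrors the test: start=1, middle=3, end=11, window 9. The largest
        // gap is 11 - 3 = 8 < 9, so the match survives even though the whole
        // span (11 - 1 = 10) would exceed a FIRST_AND_LAST window of 9 --
        // which is why both starts produce a match here.
        System.out.println(gapsWithinWindow(new long[] {1, 3, 11}, 9L)); // true
        System.out.println(gapsWithinWindow(new long[] {1, 3, 13}, 9L)); // false
    }
}
```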
@Test
public void testSimplePatternWithTimeoutHandlingWithinFirstAndLast() throws Exception {
List<StreamRecord<Event>> events = new ArrayList<>();
List<Map<String, List<Event>>> resultingPatterns = new ArrayList<>();
Set<Tuple2<Map<String, List<Event>>, Long>> resultingTimeoutPatterns = new HashSet<>();
Set<Tuple2<Map<String, List<Event>>, Long>> expectedTimeoutPatterns = new HashSet<>();
events.add(new StreamRecord<>(new Event(1, "start", 1.0), 1));
events.add(new StreamRecord<>(new Event(2, "start", 1.0), 2));
events.add(new StreamRecord<>(new Event(3, "middle", 1.0), 3));
events.add(new StreamRecord<>(new Event(4, "foobar", 1.0), 4));
events.add(new StreamRecord<>(new Event(5, "end", 1.0), 11));
events.add(new StreamRecord<>(new Event(6, "end", 1.0), 13));
Map<String, List<Event>> timeoutPattern1 = new HashMap<>();
timeoutPattern1.put("start", Collections.singletonList(new Event(1, "start", 1.0)));
timeoutPattern1.put("middle", Collections.singletonList(new Event(3, "middle", 1.0)));
Map<String, List<Event>> timeoutPattern2 = new HashMap<>();
timeoutPattern2.put("start", Collections.singletonList(new Event(2, "start", 1.0)));
timeoutPattern2.put("middle", Collections.singletonList(new Event(3, "middle", 1.0)));
Map<String, List<Event>> timeoutPattern3 = new HashMap<>();
timeoutPattern3.put("start", Collections.singletonList(new Event(1, "start", 1.0)));
Map<String, List<Event>> timeoutPattern4 = new HashMap<>();
timeoutPattern4.put("start", Collections.singletonList(new Event(2, "start", 1.0)));
expectedTimeoutPatterns.add(Tuple2.of(timeoutPattern1, 11L));
expectedTimeoutPatterns.add(Tuple2.of(timeoutPattern2, 12L));
expectedTimeoutPatterns.add(Tuple2.of(timeoutPattern3, 11L));
expectedTimeoutPatterns.add(Tuple2.of(timeoutPattern4, 12L));
Pattern<Event, ?> pattern =
Pattern.<Event>begin("start")
.where(SimpleCondition.of(value -> value.getName().equals("start")))
.followedByAny("middle")
.where(SimpleCondition.of(value -> value.getName().equals("middle")))
.followedByAny("end")
.where(SimpleCondition.of(value -> value.getName().equals("end")))
.within(Duration.ofMillis(10));
NFA<Event> nfa = compile(pattern, true);
NFAState nfaState = nfa.createInitialNFAState();
for (StreamRecord<Event> event : events) {
Collection<Tuple2<Map<String, List<Event>>, Long>> timeoutPatterns =
nfa.advanceTime(
sharedBufferAccessor,
nfaState,
event.getTimestamp(),
AfterMatchSkipStrategy.noSkip())
.f1;
Collection<Map<String, List<Event>>> matchedPatterns =
nfa.process(
sharedBufferAccessor,
nfaState,
event.getValue(),
event.getTimestamp(),
AfterMatchSkipStrategy.noSkip(),
new TestTimerService());
resultingPatterns.addAll(matchedPatterns);
resultingTimeoutPatterns.addAll(timeoutPatterns);
}
assertEquals(1, resultingPatterns.size());
assertEquals(expectedTimeoutPatterns.size(), resultingTimeoutPatterns.size());
assertEquals(expectedTimeoutPatterns, resultingTimeoutPatterns);
} | Tests that the NFA successfully returns partially matched event sequences when they time
out, with the timeout window measured between the first and last event. | testSimplePatternWithTimeoutHandlingWithinFirstAndLast | java | apache/flink | flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/NFAITCase.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/NFAITCase.java | Apache-2.0 |
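The timestamps in the expected timeouts (11L for the t=1 start, 12L for the t=2 start) follow a simple rule: a timed-out partial match is reported at firstTimestamp + windowLength. A standalone sketch of that advance-time step (illustrative, not the NFA implementation):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class TimeoutEmissionSketch {

    // When event time advances to 'now', every partial match whose window has
    // elapsed is removed and reported as timed out, tagged with the timestamp
    // firstTimestamp + windowLength at which it expired.
    static List<long[]> advanceTime(List<Long> partialStartTimes, long now, long windowLength) {
        List<long[]> timedOut = new ArrayList<>();
        Iterator<Long> it = partialStartTimes.iterator();
        while (it.hasNext()) {
            long start = it.next();
            if (now - start >= windowLength) {
                timedOut.add(new long[] {start, start + windowLength});
                it.remove();
            }
        }
        return timedOut;
    }

    public static void main(String[] args) {
        // Mirrors the test: partial matches starting at t=1 and t=2, window 10,
        // event time advances to t=13.
        List<Long> partials = new ArrayList<>(Arrays.asList(1L, 2L));
        for (long[] t : advanceTime(partials, 13L, 10L)) {
            System.out.println("start=" + t[0] + " timed out at " + t[1]);
        }
    }
}
```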
@Test
public void testWindowBorders() throws Exception {
List<StreamRecord<Event>> streamEvents = new ArrayList<>();
streamEvents.add(new StreamRecord<>(new Event(1, "start", 1.0), 1L));
streamEvents.add(new StreamRecord<>(new Event(2, "end", 2.0), 3L));
List<Map<String, List<Event>>> expectedPatterns = Collections.emptyList();
NFA<Event> nfa = createStartEndNFA();
NFATestHarness nfaTestHarness = NFATestHarness.forNFA(nfa).build();
Collection<Map<String, List<Event>>> actualPatterns =
nfaTestHarness.consumeRecords(streamEvents);
assertEquals(expectedPatterns, actualPatterns);
} | Tests that elements whose timestamp difference is exactly the window length are not matched.
The reason is that the right window side (later elements) is exclusive. | testWindowBorders | java | apache/flink | flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/NFATest.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/NFATest.java | Apache-2.0 |
@Override
public boolean filter(Event value) throws Exception {
throw new RuntimeException("It should never arrive here.");
} | A filter implementation to test invalid pattern specification with duplicate pattern names.
Check {@link #testNFACompilerUniquePatternName()}. | filter | java | apache/flink | flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/compiler/NFACompilerTest.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/compiler/NFACompilerTest.java | Apache-2.0 |
@Test
public void testDuplicate() {
IntSerializer nonDuplicatingInnerSerializer = IntSerializer.INSTANCE;
Assert.assertSame(nonDuplicatingInnerSerializer, nonDuplicatingInnerSerializer.duplicate());
Lockable.LockableTypeSerializer<Integer> candidateTestShallowDuplicate =
new Lockable.LockableTypeSerializer<>(nonDuplicatingInnerSerializer);
Assert.assertSame(candidateTestShallowDuplicate, candidateTestShallowDuplicate.duplicate());
TestDuplicateSerializer duplicatingInnerSerializer = new TestDuplicateSerializer();
Assert.assertNotSame(duplicatingInnerSerializer, duplicatingInnerSerializer.duplicate());
Lockable.LockableTypeSerializer<Integer> candidateTestDeepDuplicate =
new Lockable.LockableTypeSerializer<>(duplicatingInnerSerializer);
Lockable.LockableTypeSerializer<Integer> deepDuplicate =
candidateTestDeepDuplicate.duplicate();
Assert.assertNotSame(candidateTestDeepDuplicate, deepDuplicate);
Assert.assertNotSame(
candidateTestDeepDuplicate.getElementSerializer(),
deepDuplicate.getElementSerializer());
} | This tests that {@link Lockable.LockableTypeSerializer#duplicate()} works as expected. | testDuplicate | java | apache/flink | flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/sharedbuffer/LockableTypeSerializerTest.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/sharedbuffer/LockableTypeSerializerTest.java | Apache-2.0 |
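The test above checks a common serializer-wrapping pattern: a stateless inner serializer may return itself from duplicate(), while a stateful one must return a fresh copy, and the wrapper has to mirror whichever its inner serializer does. A minimal sketch of that contract with an illustrative Serializer interface (not the Flink TypeSerializer API):

```java
public class DuplicateSketch {

    interface Serializer<T> {
        Serializer<T> duplicate();
    }

    // A stateless serializer can safely return itself from duplicate().
    static class StatelessSerializer implements Serializer<Integer> {
        public Serializer<Integer> duplicate() {
            return this;
        }
    }

    // A stateful serializer must return a fresh instance for thread safety.
    static class StatefulSerializer implements Serializer<Integer> {
        public Serializer<Integer> duplicate() {
            return new StatefulSerializer();
        }
    }

    // A wrapper mirrors its inner serializer: shallow duplicate (this) when the
    // inner one returns itself, deep duplicate (new wrapper + new inner) otherwise.
    static class WrappingSerializer<T> implements Serializer<T> {
        final Serializer<T> inner;

        WrappingSerializer(Serializer<T> inner) {
            this.inner = inner;
        }

        public Serializer<T> duplicate() {
            Serializer<T> innerCopy = inner.duplicate();
            return innerCopy == inner ? this : new WrappingSerializer<>(innerCopy);
        }
    }

    public static void main(String[] args) {
        WrappingSerializer<Integer> shallow = new WrappingSerializer<>(new StatelessSerializer());
        System.out.println(shallow.duplicate() == shallow); // true
        WrappingSerializer<Integer> deep = new WrappingSerializer<>(new StatefulSerializer());
        System.out.println(deep.duplicate() == deep);       // false
    }
}
```

This mirrors the assertions in the test: assertSame for the stateless inner serializer, assertNotSame for both the wrapper and its element serializer in the stateful case.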
@Test
public void testReleaseNodesWithLongPath() throws Exception {
SharedBuffer<Event> sharedBuffer =
TestSharedBuffer.createTestBuffer(Event.createTypeSerializer(), cacheConfig);
final int numberEvents = 100000;
Event[] events = new Event[numberEvents];
EventId[] eventIds = new EventId[numberEvents];
NodeId[] nodeIds = new NodeId[numberEvents];
final long timestamp = 1L;
for (int i = 0; i < numberEvents; i++) {
events[i] = new Event(i + 1, "e" + (i + 1), i);
eventIds[i] = sharedBuffer.registerEvent(events[i], timestamp);
}
try (SharedBufferAccessor<Event> sharedBufferAccessor = sharedBuffer.getAccessor()) {
for (int i = 0; i < numberEvents; i++) {
NodeId prevId = i == 0 ? null : nodeIds[i - 1];
nodeIds[i] =
sharedBufferAccessor.put(
"n" + i, eventIds[i], prevId, DeweyNumber.fromString("1.0"));
}
NodeId lastNode = nodeIds[numberEvents - 1];
sharedBufferAccessor.releaseNode(lastNode, DeweyNumber.fromString("1.0"));
for (int i = 0; i < numberEvents; i++) {
sharedBufferAccessor.releaseEvent(eventIds[i]);
}
}
assertTrue(sharedBuffer.isEmpty());
} | Tests releasing a node that has a long path to the terminal node (the node without an
outgoing edge).
@throws Exception if creating the shared buffer accessor fails. | testReleaseNodesWithLongPath | java | apache/flink | flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferTest.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/nfa/sharedbuffer/SharedBufferTest.java | Apache-2.0 |
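The point of the 100000-node chain is that the release must walk a very long predecessor path without blowing the stack. A standalone sketch of reference-counted release along such a chain (illustrative data structures, not the SharedBuffer implementation):

```java
import java.util.HashMap;
import java.util.Map;

public class RefCountReleaseSketch {

    // Each node references its predecessor. Releasing the terminal node walks
    // the chain iteratively, decrementing counts and removing nodes whose
    // count drops to zero; it stops at the first node still referenced by
    // another path. An iterative walk matters for long paths: a recursive
    // release of 100000 nodes could overflow the stack.
    static void release(Map<Integer, Integer> refCounts, int[] predecessor, int nodeId) {
        int current = nodeId;
        while (current >= 0) {
            int count = refCounts.merge(current, -1, Integer::sum);
            if (count > 0) {
                break; // still referenced elsewhere
            }
            refCounts.remove(current);
            current = predecessor[current];
        }
    }

    public static void main(String[] args) {
        int n = 100_000;
        Map<Integer, Integer> refCounts = new HashMap<>();
        int[] predecessor = new int[n];
        for (int i = 0; i < n; i++) {
            refCounts.put(i, 1);
            predecessor[i] = i - 1; // node 0 has no predecessor (-1)
        }
        release(refCounts, predecessor, n - 1);
        System.out.println(refCounts.isEmpty()); // true: the whole chain is freed
    }
}
```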
private static PatternProcessFunction<Event, String> extractTimestampAndNames(int stateNumber) {
return new AccessContextWithNames(
stateNumber, context -> String.valueOf(context.timestamp()));
} | Creates a {@link PatternProcessFunction} that produces Strings of the form {@code
[timestamp]:[Event.getName]...}, where Event.getName occurs stateNumber times. If the
match does not contain the n-th pattern, that position is replaced with "null".
@param stateNumber number of states in the pattern
@return created PatternProcessFunction | extractTimestampAndNames | java | apache/flink | flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/operator/CepProcessFunctionContextTest.java | https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/operator/CepProcessFunctionContextTest.java | Apache-2.0 |
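The output format described in the docstring can be sketched as a plain string builder (a hypothetical helper; the colon separator between name entries is an assumption, only the colon after the timestamp is stated in the docstring):

```java
public class ContextOutputSketch {

    // Builds "[timestamp]:[name]..." output: the timestamp followed by one
    // entry per pattern state, with "null" substituted for states that did
    // not participate in the match.
    static String format(long timestamp, String[] matchedNames, int stateNumber) {
        StringBuilder sb = new StringBuilder(String.valueOf(timestamp));
        for (int i = 0; i < stateNumber; i++) {
            sb.append(':');
            sb.append(i < matchedNames.length ? matchedNames[i] : "null");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Three pattern states, only the first two matched.
        System.out.println(format(15L, new String[] {"a", "b"}, 3)); // 15:a:b:null
    }
}
```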