code | docstring | func_name | language | repo | path | url | license |
|---|---|---|---|---|---|---|---|
public static InputTypeStrategy or(InputTypeStrategy... strategies) {
return new OrInputTypeStrategy(Arrays.asList(strategies));
} | Strategy for a disjunction of multiple {@link InputTypeStrategy}s into one like {@code
f(NUMERIC) || f(STRING)}.
<p>This strategy aims to infer a list of types that are equal to the input types (to prevent
unnecessary casting) or (if this is not possible) the first more specific, casted types. | or | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
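The preference order described above (keep the input types when possible, else fall back to the first alternative's more specific cast) can be sketched without Flink. Everything below is hypothetical illustration, not Flink API: `OrSketch` is a made-up class, types are plain strings, and real castability rules are ignored.

```java
import java.util.List;
import java.util.Optional;

// Hypothetical sketch of the disjunction preference: prefer the input type
// unchanged, otherwise cast to the first alternative's target type.
public class OrSketch {

    // Each alternative is a list of accepted type names; its first entry is
    // treated as the alternative's preferred cast target.
    static Optional<String> infer(String inputType, List<List<String>> alternatives) {
        // 1) Prefer an alternative that accepts the input type as-is
        //    (prevents unnecessary casting).
        for (List<String> alternative : alternatives) {
            if (alternative.contains(inputType)) {
                return Optional.of(inputType);
            }
        }
        // 2) Otherwise fall back to the first alternative's cast target.
        return alternatives.isEmpty()
                ? Optional.empty()
                : Optional.of(alternatives.get(0).get(0));
    }
}
```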
public static InputTypeStrategy wildcardWithCount(ArgumentCount argumentCount) {
return new WildcardInputTypeStrategy(argumentCount);
} | Strategy that does not perform any modification or validation of the input. It checks the
argument count though. | wildcardWithCount | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static InputTypeStrategy comparable(
ConstantArgumentCount argumentCount, StructuredComparison requiredComparison) {
return new ComparableTypeStrategy(argumentCount, requiredComparison);
} | Strategy that checks that all types are comparable with each other. Requires at least one
argument. | comparable | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static ExplicitArgumentTypeStrategy explicit(DataType expectedDataType) {
return new ExplicitArgumentTypeStrategy(expectedDataType);
} | Strategy for an argument that corresponds to an explicitly defined type casting. Implicit
casts will be inserted if possible. | explicit | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static RootArgumentTypeStrategy logical(LogicalTypeRoot expectedRoot) {
return new RootArgumentTypeStrategy(expectedRoot, null);
} | Strategy for an argument that corresponds to a given {@link LogicalTypeRoot}. Implicit casts
will be inserted if possible. | logical | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static RootArgumentTypeStrategy logical(
LogicalTypeRoot expectedRoot, boolean expectedNullability) {
return new RootArgumentTypeStrategy(expectedRoot, expectedNullability);
} | Strategy for an argument that corresponds to a given {@link LogicalTypeRoot} and nullability.
Implicit casts will be inserted if possible. | logical | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static FamilyArgumentTypeStrategy logical(LogicalTypeFamily expectedFamily) {
return new FamilyArgumentTypeStrategy(expectedFamily, null);
} | Strategy for an argument that corresponds to a given {@link LogicalTypeFamily}. Implicit
casts will be inserted if possible. | logical | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static FamilyArgumentTypeStrategy logical(
LogicalTypeFamily expectedFamily, boolean expectedNullability) {
return new FamilyArgumentTypeStrategy(expectedFamily, expectedNullability);
} | Strategy for an argument that corresponds to a given {@link LogicalTypeFamily} and
nullability. Implicit casts will be inserted if possible. | logical | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static AndArgumentTypeStrategy and(ArgumentTypeStrategy... strategies) {
return new AndArgumentTypeStrategy(Arrays.asList(strategies));
} | Strategy for a conjunction of multiple {@link ArgumentTypeStrategy}s into one like {@code
f(NUMERIC && LITERAL)}.
<p>Some {@link ArgumentTypeStrategy}s cannot contribute an inferred type that is different
from the input type (e.g. {@link #LITERAL}). Therefore, the order {@code f(X && Y)} or {@code
f(Y && X)} matters as it defines the precedence in case the result must be casted to a more
specific type.
<p>This strategy aims to infer the first more specific, casted type or (if this is not
possible) a type that has been inferred from all {@link ArgumentTypeStrategy}s. | and | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
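The ordering sensitivity described above can likewise be sketched without Flink. In this hypothetical reduction (`AndSketch` is not Flink API), each constraint either rejects a type, passes it through unchanged (like `LITERAL`, which cannot contribute a cast), or maps it to a more specific type; applying constraints in order is what makes `f(X && Y)` differ from `f(Y && X)`.

```java
import java.util.List;
import java.util.Optional;
import java.util.function.UnaryOperator;

// Hypothetical sketch of conjunction ("&&") semantics over string type names.
public class AndSketch {

    static Optional<String> infer(String inputType, List<UnaryOperator<String>> constraints) {
        String result = inputType;
        for (UnaryOperator<String> constraint : constraints) {
            String mapped = constraint.apply(result);
            if (mapped == null) {
                // every constraint of the conjunction must accept the argument
                return Optional.empty();
            }
            // an earlier cast takes precedence: later constraints see the
            // already-casted type
            result = mapped;
        }
        return Optional.of(result);
    }
}
```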
public static OrArgumentTypeStrategy or(ArgumentTypeStrategy... strategies) {
return new OrArgumentTypeStrategy(Arrays.asList(strategies));
} | Strategy for a disjunction of multiple {@link ArgumentTypeStrategy}s into one like {@code
f(NUMERIC || STRING)}.
<p>Some {@link ArgumentTypeStrategy}s cannot contribute an inferred type that is different
from the input type (e.g. {@link #LITERAL}). Therefore, the order {@code f(X || Y)} or {@code
f(Y || X)} matters as it defines the precedence in case the result must be casted to a more
specific type.
<p>This strategy aims to infer a type that is equal to the input type (to prevent unnecessary
casting) or (if this is not possible) the first more specific, casted type. | or | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static SymbolArgumentTypeStrategy<?> symbol(
Class<? extends Enum<? extends TableSymbol>> clazz) {
return new SymbolArgumentTypeStrategy<>(clazz);
} | Strategy for a symbol argument of a specific {@link TableSymbol} enum.
<p>A symbol is implied to be a literal argument. | symbol | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
@SafeVarargs
@SuppressWarnings("unchecked")
public static <T extends Enum<? extends TableSymbol>> SymbolArgumentTypeStrategy<T> symbol(
T firstAllowedVariant, T... otherAllowedVariants) {
return new SymbolArgumentTypeStrategy<T>(
(Class<T>) firstAllowedVariant.getClass(),
Stream.concat(Stream.of(firstAllowedVariant), Arrays.stream(otherAllowedVariants))
.collect(Collectors.toSet()));
} | Strategy for a symbol argument of a specific {@link TableSymbol} enum, with value being one
of the provided variants.
<p>A symbol is implied to be a literal argument. | symbol | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
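The variadic overload above uses a common API trick: a mandatory first parameter plus varargs guarantees at least one variant at compile time rather than via a runtime check. The merge pattern in isolation (`FirstPlusRest` is a made-up name for illustration):

```java
import java.util.Arrays;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// "First element plus varargs": calling toSet() with zero arguments is a
// compile-time error, unlike a plain T... parameter.
public class FirstPlusRest {

    @SafeVarargs
    static <T> Set<T> toSet(T first, T... rest) {
        return Stream.concat(Stream.of(first), Arrays.stream(rest))
                .collect(Collectors.toSet());
    }
}
```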
public static InputTypeStrategy commonType(int count) {
return new CommonInputTypeStrategy(ConstantArgumentCount.of(count));
} | An {@link InputTypeStrategy} that expects {@code count} arguments that have a common type. | commonType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static InputTypeStrategy commonArrayType(int count) {
return new CommonArrayInputTypeStrategy(ConstantArgumentCount.of(count));
} | An {@link InputTypeStrategy} that expects {@code count} arguments that have a common array
type. | commonArrayType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static InputTypeStrategy commonMultipleArrayType(int minCount) {
return new CommonArrayInputTypeStrategy(ConstantArgumentCount.from(minCount));
} | An {@link InputTypeStrategy} that expects {@code minCount} arguments that have a common array
type. | commonMultipleArrayType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static InputTypeStrategy commonMapType(int minCount) {
return new CommonMapInputTypeStrategy(ConstantArgumentCount.from(minCount));
} | An {@link InputTypeStrategy} that expects {@code minCount} arguments that have a common map
type. | commonMapType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/InputTypeStrategies.java | Apache-2.0 |
public static StaticArgument scalar(String name, DataType dataType, boolean isOptional) {
Preconditions.checkNotNull(dataType, "Data type must not be null.");
return new StaticArgument(
name, dataType, null, isOptional, EnumSet.of(StaticArgumentTrait.SCALAR));
} | Declares a scalar argument such as {@code f(12)} or {@code f(otherColumn)}.
@param name name of the argument; usable with the assignment operator, e.g. {@code f(myArg => 12)}
@param dataType explicit type to which the argument is cast if necessary
@param isOptional whether the argument is optional; if optional, the corresponding data type
must be nullable | scalar | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/StaticArgument.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/StaticArgument.java | Apache-2.0 |
public static StaticArgument table(
String name,
Class<?> conversionClass,
boolean isOptional,
EnumSet<StaticArgumentTrait> traits) {
Preconditions.checkNotNull(conversionClass, "Conversion class must not be null.");
final EnumSet<StaticArgumentTrait> enrichedTraits = EnumSet.copyOf(traits);
enrichedTraits.add(StaticArgumentTrait.TABLE);
if (!enrichedTraits.contains(StaticArgumentTrait.TABLE_AS_SET)) {
enrichedTraits.add(StaticArgumentTrait.TABLE_AS_ROW);
}
return new StaticArgument(name, null, conversionClass, isOptional, enrichedTraits);
} | Declares a table argument such as {@code f(t => myTable)} or {@code f(t => TABLE myTable)}.
<p>The argument can have {@link StaticArgumentTrait#TABLE_AS_ROW} (default) or {@link
StaticArgumentTrait#TABLE_AS_SET} semantics.
<p>By only providing a conversion class, the argument supports a "polymorphic" behavior. In
other words: it accepts tables with an arbitrary number of columns with arbitrary data types.
For this case, a class satisfying {@link RowType#supportsOutputConversion(Class)} must be
used.
@param name name of the argument; usable with the assignment operator, e.g. {@code f(myArg => 12)}
@param conversionClass a class satisfying {@link RowType#supportsOutputConversion(Class)}
@param isOptional whether the argument is optional
@param traits set of {@link StaticArgumentTrait} requiring {@link StaticArgumentTrait#TABLE} | table | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/StaticArgument.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/StaticArgument.java | Apache-2.0 |
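The trait-enrichment logic above can be shown with a stand-in enum (the real `StaticArgumentTrait` has more members; `TraitEnrichment` is a hypothetical sketch): every table argument receives `TABLE`, and row semantics is the default unless set semantics was requested.

```java
import java.util.EnumSet;

// Sketch of the trait enrichment performed by StaticArgument.table(...).
public class TraitEnrichment {

    enum Trait { TABLE, TABLE_AS_ROW, TABLE_AS_SET }

    static EnumSet<Trait> enrich(EnumSet<Trait> traits) {
        EnumSet<Trait> enriched = EnumSet.copyOf(traits);
        enriched.add(Trait.TABLE); // every table argument carries TABLE
        if (!enriched.contains(Trait.TABLE_AS_SET)) {
            enriched.add(Trait.TABLE_AS_ROW); // row semantics is the default
        }
        return enriched;
    }
}
```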
public Set<StaticArgumentTrait> getRequirements() {
return requirements;
} | Declares traits for {@link StaticArgument}. They enable basic validation by the framework.
<p>Some traits have dependencies to other traits, which is why this enum reflects a hierarchy in
which {@link #SCALAR}, {@link #TABLE}, and {@link #MODEL} are the top-level roots. | getRequirements | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/StaticArgumentTrait.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/StaticArgumentTrait.java | Apache-2.0 |
public Builder inputTypeStrategy(InputTypeStrategy inputTypeStrategy) {
this.inputTypeStrategy =
Preconditions.checkNotNull(
inputTypeStrategy, "Input type strategy must not be null.");
return this;
} | Sets the strategy for inferring and validating input arguments in a function call.
<p>A {@link InputTypeStrategies#WILDCARD} strategy function is assumed by default. | inputTypeStrategy | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | Apache-2.0 |
public Builder accumulatorTypeStrategy(TypeStrategy accumulatorTypeStrategy) {
Preconditions.checkNotNull(
accumulatorTypeStrategy, "Accumulator type strategy must not be null.");
this.stateTypeStrategies.put(
UserDefinedFunctionHelper.DEFAULT_ACCUMULATOR_NAME,
StateTypeStrategy.of(accumulatorTypeStrategy));
return this;
} | Sets the strategy for inferring the intermediate accumulator data type of an aggregate
function call. | accumulatorTypeStrategy | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | Apache-2.0 |
public Builder stateTypeStrategies(
LinkedHashMap<String, StateTypeStrategy> stateTypeStrategies) {
this.stateTypeStrategies = stateTypeStrategies;
return this;
} | Sets a map of state names to {@link StateTypeStrategy}s for inferring a function call's
intermediate result data types (i.e. state entries). For aggregate functions, only one
entry is allowed which defines the accumulator's data type. | stateTypeStrategies | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | Apache-2.0 |
@Deprecated
public Builder namedArguments(List<String> argumentNames) {
this.namedArguments =
Preconditions.checkNotNull(
argumentNames, "List of argument names must not be null.");
return this;
} | Sets the list of argument names for specifying a fixed, not overloaded, not vararg input
signature explicitly.
<p>This information is useful for SQL's concept of named arguments using the assignment
operator (e.g. {@code FUNC(max => 42)}). The names are used for reordering the call's
arguments to the formal argument order of the function.
@deprecated Use {@link #staticArguments(List)} instead. | namedArguments | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | Apache-2.0 |
@Deprecated
public Builder namedArguments(String... argumentNames) {
return namedArguments(Arrays.asList(argumentNames));
} | @see #namedArguments(List)
@deprecated Use {@link #staticArguments(StaticArgument...)} instead. | namedArguments | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | Apache-2.0 |
@Deprecated
public Builder optionalArguments(List<Boolean> optionalArguments) {
this.optionalArguments =
Preconditions.checkNotNull(
optionalArguments, "List of argument optionals must not be null.");
return this;
} | Sets the list of argument optionals for specifying optional arguments in the input
signature explicitly.
<p>This information is useful for SQL's concept of named arguments using the assignment
operator. The optionals are used to determine whether an argument is optional or required
in the function call.
@deprecated Use {@link #staticArguments(List)} instead. | optionalArguments | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | Apache-2.0 |
@Deprecated
public Builder typedArguments(List<DataType> argumentTypes) {
this.typedArguments =
Preconditions.checkNotNull(
argumentTypes, "List of argument types must not be null.");
return this;
} | Sets the list of argument types for specifying a fixed, not overloaded, not vararg input
signature explicitly.
<p>This information is useful for optional arguments with default value. In particular,
the number of arguments that need to be filled with a default value and their types are known.
@deprecated Use {@link #staticArguments(List)} instead. | typedArguments | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java | Apache-2.0 |
public static Result runTypeInference(
TypeInference typeInference,
CallContext callContext,
@Nullable SurroundingInfo surroundingInfo) {
try {
return runTypeInferenceInternal(typeInference, callContext, surroundingInfo);
} catch (ValidationException e) {
throw createInvalidCallException(callContext, e);
} catch (Throwable t) {
throw createUnexpectedException(callContext, t);
}
} | Runs the entire type inference process.
@param typeInference type inference of the current call
@param callContext call context of the current call
@param surroundingInfo information about the outer wrapping call of a current function call
for performing input type inference | runTypeInference | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | Apache-2.0 |
public static DataType inferOutputType(
CallContext callContext, TypeStrategy outputTypeStrategy) {
final Optional<DataType> potentialOutputType = outputTypeStrategy.inferType(callContext);
if (potentialOutputType.isEmpty()) {
throw new ValidationException(
"Could not infer an output type for the given arguments.");
}
final DataType outputType = potentialOutputType.get();
if (isUnknown(outputType)) {
throw new ValidationException(
"Could not infer an output type for the given arguments. Untyped NULL received.");
}
return outputType;
} | Infers an output type using the given {@link TypeStrategy}. It assumes that input arguments
have been adapted before if necessary. | inferOutputType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | Apache-2.0 |
public static LinkedHashMap<String, StateInfo> inferStateInfos(
CallContext callContext, LinkedHashMap<String, StateTypeStrategy> stateTypeStrategies) {
return stateTypeStrategies.entrySet().stream()
.map(
e ->
Map.entry(
e.getKey(),
inferStateInfo(callContext, e.getKey(), e.getValue())))
.collect(
Collectors.toMap(
Map.Entry::getKey,
Map.Entry::getValue,
(x, y) -> y,
LinkedHashMap::new));
} | Infers {@link StateInfo}s using the given {@link StateTypeStrategy}s. It assumes that input
arguments have been adapted before if necessary. | inferStateInfos | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | Apache-2.0 |
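The collector in `inferStateInfos` preserves state declaration order by supplying `LinkedHashMap::new`; a plain `Collectors.toMap` makes no ordering guarantee. The same pattern on simple data (names and the length-mapping are illustrative only):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Maps entry values while keeping the input's insertion order.
public class OrderedMapping {

    static LinkedHashMap<String, Integer> lengths(LinkedHashMap<String, String> in) {
        return in.entrySet().stream()
                .map(e -> Map.entry(e.getKey(), e.getValue().length()))
                .collect(Collectors.toMap(
                        Map.Entry::getKey,
                        Map.Entry::getValue,
                        (x, y) -> y,          // merge function; never hit, keys are unique
                        LinkedHashMap::new)); // keeps the original entry order
    }
}
```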
public static String generateSignature(
TypeInference typeInference, String name, FunctionDefinition definition) {
final List<StaticArgument> staticArguments =
typeInference.getStaticArguments().orElse(null);
if (staticArguments != null) {
return formatStaticArguments(name, staticArguments);
}
return typeInference.getInputTypeStrategy().getExpectedSignatures(definition).stream()
.map(s -> formatSignature(name, s))
.collect(Collectors.joining("\n"));
} | Generates a signature of the given {@link FunctionDefinition}. | generateSignature | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | Apache-2.0 |
public static ValidationException createInvalidInputException(
TypeInference typeInference, CallContext callContext, ValidationException cause) {
return new ValidationException(
String.format(
"Invalid input arguments. Expected signatures are:\n%s",
generateSignature(
typeInference,
callContext.getName(),
callContext.getFunctionDefinition())),
cause);
} | Returns an exception for invalid input arguments. | createInvalidInputException | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | Apache-2.0 |
public static ValidationException createInvalidCallException(
CallContext callContext, ValidationException cause) {
return new ValidationException(
String.format(
"Invalid function call:\n%s(%s)",
callContext.getName(),
callContext.getArgumentDataTypes().stream()
.map(DataType::toString)
.collect(Collectors.joining(", "))),
cause);
} | Returns an exception for an invalid call to a function. | createInvalidCallException | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | Apache-2.0 |
public static TableException createUnexpectedException(
CallContext callContext, Throwable cause) {
return new TableException(
String.format(
"Unexpected error in type inference logic of function '%s'. This is a bug.",
callContext.getName()),
cause);
} | Returns an exception for an unexpected error during type inference. | createUnexpectedException | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | Apache-2.0 |
public static boolean validateArgumentCount(
ArgumentCount argumentCount, int actualCount, boolean throwOnFailure) {
final int minCount = argumentCount.getMinCount().orElse(0);
if (actualCount < minCount) {
if (throwOnFailure) {
throw new ValidationException(
String.format(
"Invalid number of arguments. At least %d arguments expected but %d passed.",
minCount, actualCount));
}
return false;
}
final int maxCount = argumentCount.getMaxCount().orElse(Integer.MAX_VALUE);
if (actualCount > maxCount) {
if (throwOnFailure) {
throw new ValidationException(
String.format(
"Invalid number of arguments. At most %d arguments expected but %d passed.",
maxCount, actualCount));
}
return false;
}
if (!argumentCount.isValidCount(actualCount)) {
if (throwOnFailure) {
throw new ValidationException(
String.format(
"Invalid number of arguments. %d arguments passed.", actualCount));
}
return false;
}
return true;
} | Validates argument counts.
@param argumentCount expected argument count
@param actualCount actual argument count
@param throwOnFailure if true, the function throws a {@link ValidationException} if the
actual value does not meet the expected argument count
@return a boolean indicating if expected argument counts match the actual counts | validateArgumentCount | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInferenceUtil.java | Apache-2.0 |
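Stripped of Flink's `ArgumentCount` interface and the exception paths, the bounds check above reduces to the following standalone sketch, with min/max modeled as `OptionalInt` (absent means unbounded):

```java
import java.util.OptionalInt;

// Hedged reduction of validateArgumentCount's min/max logic.
public class ArgCountCheck {

    static boolean isValidCount(int actualCount, OptionalInt minCount, OptionalInt maxCount) {
        if (actualCount < minCount.orElse(0)) {
            return false; // fewer arguments than the lower bound
        }
        // the upper bound defaults to "unbounded"
        return actualCount <= maxCount.orElse(Integer.MAX_VALUE);
    }
}
```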
public static TypeStrategy argument(int pos) {
return new ArgumentMappingTypeStrategy(pos, Optional::of);
} | Type strategy that returns the n-th input argument. | argument | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | Apache-2.0 |
public static TypeStrategy argument(int pos, Function<DataType, Optional<DataType>> mapper) {
return new ArgumentMappingTypeStrategy(pos, mapper);
} | Type strategy that returns the n-th input argument, mapping it. | argument | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | Apache-2.0 |
public static TypeStrategy mapping(Map<InputTypeStrategy, TypeStrategy> mappings) {
return new MappingTypeStrategy(mappings);
} | Type strategy that maps an {@link InputTypeStrategy} to a {@link TypeStrategy} if the input
strategy infers identical types. | mapping | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | Apache-2.0 |
public static TypeStrategy forceNullable(TypeStrategy initialStrategy) {
return new ForceNullableTypeStrategy(initialStrategy);
} | Type strategy which forces the given {@code initialStrategy} to be nullable. | forceNullable | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | Apache-2.0 |
public static TypeStrategy nullableIfArgs(
ConstantArgumentCount includedArgs, TypeStrategy initialStrategy) {
return new NullableIfArgsTypeStrategy(includedArgs, initialStrategy, false);
} | A type strategy that can be used to make a result type nullable if any of the selected input
arguments is nullable. Otherwise the type will be non-nullable. | nullableIfArgs | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | Apache-2.0 |
public static TypeStrategy nullableIfArgs(TypeStrategy initialStrategy) {
return nullableIfArgs(ConstantArgumentCount.any(), initialStrategy);
} | A type strategy that can be used to make a result type nullable if any of the input arguments
is nullable. Otherwise the type will be non-nullable. | nullableIfArgs | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | Apache-2.0 |
public static TypeStrategy nullableIfAllArgs(
ConstantArgumentCount includedArgs, TypeStrategy initialStrategy) {
return new NullableIfArgsTypeStrategy(includedArgs, initialStrategy, true);
} | A type strategy that can be used to make a result type nullable if all the selected input
arguments are nullable. Otherwise the type will be non-nullable. | nullableIfAllArgs | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | Apache-2.0 |
public static TypeStrategy nullableIfAllArgs(TypeStrategy initialStrategy) {
return nullableIfAllArgs(ConstantArgumentCount.any(), initialStrategy);
} | A type strategy that can be used to make a result type nullable if all of the input arguments are
nullable. Otherwise the type will be non-nullable. | nullableIfAllArgs | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | Apache-2.0 |
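The difference between `nullableIfArgs` and `nullableIfAllArgs` is an any-vs-all quantifier over the selected arguments' nullability. Reduced to booleans (`NullableArgsSketch` is a hypothetical helper, not Flink API):

```java
import java.util.List;

// Any-vs-all nullability quantifiers behind the two strategy families.
public class NullableArgsSketch {

    // nullableIfArgs: result nullable if ANY selected argument is nullable
    static boolean anyNullable(List<Boolean> argNullability) {
        return argNullability.stream().anyMatch(n -> n);
    }

    // nullableIfAllArgs: result nullable only if ALL selected arguments are nullable
    static boolean allNullable(List<Boolean> argNullability) {
        return argNullability.stream().allMatch(n -> n);
    }
}
```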
public static TypeStrategy varyingString(TypeStrategy initialStrategy) {
return new VaryingStringTypeStrategy(initialStrategy);
} | A type strategy that ensures that the result type is either {@link LogicalTypeRoot#VARCHAR}
or {@link LogicalTypeRoot#VARBINARY} from their corresponding non-varying roots. | varyingString | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | Apache-2.0 |
public static TypeStrategy aggArg0(
Function<LogicalType, LogicalType> aggType, boolean nullableIfGroupingEmpty) {
return callContext -> {
final DataType argDataType = callContext.getArgumentDataTypes().get(0);
final LogicalType argType = argDataType.getLogicalType();
LogicalType result = aggType.apply(argType);
if (nullableIfGroupingEmpty && !callContext.isGroupedAggregation()) {
// null only if condition is met, otherwise arguments nullability
result = result.copy(true);
} else if (!nullableIfGroupingEmpty) {
// never null
result = result.copy(false);
}
return Optional.of(fromLogicalToDataType(result));
};
} | Type strategy specific for aggregations that partially produce different nullability
depending on whether the result is grouped or not. | aggArg0 | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeStrategies.java | Apache-2.0 |
default DataType transform(@Nullable DataTypeFactory factory, DataType typeToTransform) {
return transform(typeToTransform);
} | Transforms the given data type to a different data type.
<p>This method provides a {@link DataTypeFactory} if available. | transform | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeTransformation.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeTransformation.java | Apache-2.0 |
public static TypeTransformation timeToSqlTypes() {
Map<LogicalTypeRoot, Class<?>> conversions = new HashMap<>();
conversions.put(LogicalTypeRoot.TIMESTAMP_WITHOUT_TIME_ZONE, Timestamp.class);
conversions.put(LogicalTypeRoot.TIME_WITHOUT_TIME_ZONE, Time.class);
conversions.put(LogicalTypeRoot.DATE, Date.class);
return new DataTypeConversionClassTransformation(conversions);
} | Returns a type transformation that transforms a data type to a new data type whose conversion
class is {@link java.sql.Timestamp}/{@link java.sql.Time}/{@link java.sql.Date} if the
original data type is TIMESTAMP/TIME/DATE. | timeToSqlTypes | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeTransformations.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeTransformations.java | Apache-2.0 |
public static TypeTransformation legacyRawToTypeInfoRaw() {
return LegacyRawTypeTransformation.INSTANCE;
} | Returns a type transformation that transforms LEGACY('RAW', ...) type to the RAW(..., ?)
type. | legacyRawToTypeInfoRaw | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeTransformations.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeTransformations.java | Apache-2.0 |
public static TypeTransformation legacyToNonLegacy() {
return LegacyToNonLegacyTransformation.INSTANCE;
} | Returns a type transformation that transforms LEGACY(...) type to a non-legacy type. | legacyToNonLegacy | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeTransformations.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeTransformations.java | Apache-2.0 |
public static TypeTransformation toNullable() {
return DataType::nullable;
} | Returns a type transformation that transforms a data type into a nullable data type but keeps other
information unchanged. | toNullable | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeTransformations.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeTransformations.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
final LogicalType addend1 = argumentDataTypes.get(0).getLogicalType();
final LogicalType addend2 = argumentDataTypes.get(1).getLogicalType();
Preconditions.checkArgument(addend1.is(LogicalTypeRoot.DECIMAL), ERROR_MSG);
Preconditions.checkArgument(addend2.is(LogicalTypeRoot.DECIMAL), ERROR_MSG);
return Optional.of(fromLogicalToDataType(LogicalTypeMerging.findSumAggType(addend2)));
} | Type strategy that returns the result type of a decimal addition, used internally for
implementing SUM/AVG aggregations (with and without retractions) on a Decimal type. Uses the
{@link LogicalTypeMerging#findSumAggType(LogicalType)} and prevents the {@link
DecimalPlusTypeStrategy} from overriding the special calculation for precision and scale needed
by the aggregate function. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/AggDecimalPlusTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/AggDecimalPlusTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
if (pos >= argumentDataTypes.size()) {
return Optional.empty();
}
return mapper.apply(argumentDataTypes.get(pos));
} | Type strategy that returns the n-th input argument, mapping it with the provided function. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ArgumentMappingTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ArgumentMappingTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
final DataType arrayDataType = argumentDataTypes.get(0);
final DataType elementToAddDataType = argumentDataTypes.get(1);
final LogicalType arrayElementLogicalType =
arrayDataType.getLogicalType().getChildren().get(0);
if (elementToAddDataType.getLogicalType().isNullable()
&& !arrayElementLogicalType.isNullable()) {
return Optional.of(
DataTypes.ARRAY(fromLogicalToDataType(arrayElementLogicalType).nullable()));
}
return Optional.of(arrayDataType);
} | Type strategy that returns the {@link DataTypes#ARRAY(DataType)} type of the first argument.
If the element to add is nullable while the array's element type is not, the element type of the
resulting array is widened to be nullable. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ArrayAppendPrependTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ArrayAppendPrependTypeStrategy.java | Apache-2.0 |
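The nullability widening performed above reduces to a simple disjunction; the following stdlib-only sketch (a simplification, not the Flink code — names are illustrative) captures it:

```java
public class ArrayAppendNullabilitySketch {

    /** Element nullability of the array resulting from ARRAY_APPEND/ARRAY_PREPEND. */
    static boolean resultElementNullable(boolean arrayElementNullable, boolean elementToAddNullable) {
        // A nullable element to add forces the result's element type to become nullable.
        return arrayElementNullable || elementToAddNullable;
    }
}
```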
@Override
public Optional<DataType> inferType(CallContext callContext) {
DataType arrayType = callContext.getArgumentDataTypes().get(0);
final Optional<DataType> legacyArrayElement =
StrategyUtils.extractLegacyArrayElement(arrayType);
if (legacyArrayElement.isPresent()) {
return legacyArrayElement;
}
if (!arrayType.getLogicalType().is(LogicalTypeFamily.COLLECTION)) {
return Optional.empty();
}
return Optional.of(((CollectionDataType) arrayType).getElementDataType().nullable());
} | Returns the element of an {@link LogicalTypeFamily#COLLECTION} type. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ArrayElementTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ArrayElementTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
if (argumentDataTypes.size() < 1) {
return Optional.empty();
}
return Optional.of(DataTypes.ARRAY(argumentDataTypes.get(0)).notNull());
} | Type strategy that returns a {@link DataTypes#ARRAY(DataType)} with element type equal to the
type of the first argument. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ArrayTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ArrayTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
if (argumentDataTypes.size() != 1) {
return Optional.empty();
}
return Optional.of(DataTypes.MULTISET(argumentDataTypes.get(0)).notNull());
} | Type strategy that returns a {@link DataTypes#MULTISET(DataType)} with element type equal to the
type of the first argument. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/CollectTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/CollectTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final LogicalType inputType = callContext.getArgumentDataTypes().get(0).getLogicalType();
if (inputType.is(LogicalTypeRoot.TIMESTAMP_WITHOUT_TIME_ZONE)) {
return Optional.of(DataTypes.TIMESTAMP(3));
} else if (inputType.is(LogicalTypeRoot.TIMESTAMP_WITH_LOCAL_TIME_ZONE)) {
return Optional.of(DataTypes.TIMESTAMP_LTZ(3));
}
return Optional.empty();
} | Type strategy for {@link BuiltInFunctionDefinitions#CURRENT_WATERMARK} which mirrors the type of
the passed rowtime column, but removes the rowtime kind and enforces the correct precision for
watermarks. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/CurrentWatermarkTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/CurrentWatermarkTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
final LogicalType dividend = argumentDataTypes.get(0).getLogicalType();
final LogicalType divisor = argumentDataTypes.get(1).getLogicalType();
// a hack to make legacy types possible until we drop them
if (dividend instanceof LegacyTypeInformationType) {
return Optional.of(argumentDataTypes.get(0));
}
if (divisor instanceof LegacyTypeInformationType) {
return Optional.of(argumentDataTypes.get(1));
}
if (!isDecimalComputation(dividend, divisor)) {
return Optional.empty();
}
final DecimalType decimalType =
LogicalTypeMerging.findDivisionDecimalType(
getPrecision(dividend),
getScale(dividend),
getPrecision(divisor),
getScale(divisor));
return Optional.of(fromLogicalToDataType(decimalType));
} | Type strategy that returns the quotient of an exact numeric division that includes at least one
decimal. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/DecimalDivideTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/DecimalDivideTypeStrategy.java | Apache-2.0 |
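For intuition, the SQL Server-style rule that decimal division strategies are commonly modeled on can be sketched as below. The minimum scale of 6 and the omission of the 38-digit precision cap are assumptions of this simplified sketch, not guaranteed details of `LogicalTypeMerging.findDivisionDecimalType`:

```java
public class DecimalDivideSketch {

    static final int MIN_SCALE = 6; // assumed minimum result scale

    /** Returns {precision, scale} for DECIMAL(p1,s1) / DECIMAL(p2,s2), ignoring the 38-digit cap. */
    static int[] divisionType(int p1, int s1, int p2, int s2) {
        int scale = Math.max(MIN_SCALE, s1 + p2 + 1);
        int precision = p1 - s1 + s2 + scale;
        return new int[] {precision, scale};
    }
}
```

For example, DECIMAL(10,2) / DECIMAL(5,1) yields DECIMAL(17,8) under this rule; the uncapped precision can exceed 38 for wide inputs, which is why the real implementation additionally adjusts precision and scale.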
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
final LogicalType dividend = argumentDataTypes.get(0).getLogicalType();
final LogicalType divisor = argumentDataTypes.get(1).getLogicalType();
// a hack to make legacy types possible until we drop them
if (dividend instanceof LegacyTypeInformationType) {
return Optional.of(argumentDataTypes.get(0));
}
if (divisor instanceof LegacyTypeInformationType) {
return Optional.of(argumentDataTypes.get(1));
}
if (!isDecimalComputation(dividend, divisor)) {
return Optional.empty();
}
final int dividendScale = getScale(dividend);
final int divisorScale = getScale(divisor);
if (dividendScale == 0 && divisorScale == 0) {
return Optional.of(argumentDataTypes.get(1));
}
final DecimalType decimalType =
LogicalTypeMerging.findModuloDecimalType(
getPrecision(dividend), dividendScale, getPrecision(divisor), divisorScale);
return Optional.of(fromLogicalToDataType(decimalType));
} | Type strategy that returns the modulo of an exact numeric division that includes at least one
decimal. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/DecimalModTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/DecimalModTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
final LogicalType addend1 = argumentDataTypes.get(0).getLogicalType();
final LogicalType addend2 = argumentDataTypes.get(1).getLogicalType();
// a hack to make legacy types possible until we drop them
if (addend1 instanceof LegacyTypeInformationType) {
return Optional.of(argumentDataTypes.get(0));
}
if (addend2 instanceof LegacyTypeInformationType) {
return Optional.of(argumentDataTypes.get(1));
}
if (!isDecimalComputation(addend1, addend2)) {
return Optional.empty();
}
final DecimalType decimalType =
LogicalTypeMerging.findAdditionDecimalType(
getPrecision(addend1),
getScale(addend1),
getPrecision(addend2),
getScale(addend2));
return Optional.of(fromLogicalToDataType(decimalType));
} | Type strategy that returns the sum of an exact numeric addition that includes at least one
decimal. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/DecimalPlusTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/DecimalPlusTypeStrategy.java | Apache-2.0 |
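The precision/scale arithmetic behind decimal addition can be sketched with the widely used rule below. This is an assumption of the sketch; the real `LogicalTypeMerging.findAdditionDecimalType` additionally caps precision at 38 digits:

```java
public class DecimalPlusSketch {

    /** Returns {precision, scale} for DECIMAL(p1,s1) + DECIMAL(p2,s2), ignoring the 38-digit cap. */
    static int[] additionType(int p1, int s1, int p2, int s2) {
        int scale = Math.max(s1, s2);
        // one extra integer digit accounts for a possible carry
        int precision = Math.max(p1 - s1, p2 - s2) + scale + 1;
        return new int[] {precision, scale};
    }
}
```

For example, DECIMAL(10,2) + DECIMAL(5,1) yields DECIMAL(11,2) under this rule.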
@Override
public Optional<DataType> inferType(CallContext callContext) {
final DataType argumentDataType = callContext.getArgumentDataTypes().get(0);
final LogicalType argumentType = argumentDataType.getLogicalType();
// a hack to make legacy types possible until we drop them
if (argumentType instanceof LegacyTypeInformationType) {
return Optional.of(argumentDataType);
}
if (argumentType.is(LogicalTypeRoot.DECIMAL)) {
if (hasScale(argumentType, 0)) {
return Optional.of(argumentDataType);
}
final LogicalType inferredType =
new DecimalType(argumentType.isNullable(), getPrecision(argumentType), 0);
return Optional.of(fromLogicalToDataType(inferredType));
}
return Optional.empty();
} | Strategy that returns a decimal type but with a scale of 0. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/DecimalScale0TypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/DecimalScale0TypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
final LogicalType factor1 = argumentDataTypes.get(0).getLogicalType();
final LogicalType factor2 = argumentDataTypes.get(1).getLogicalType();
// a hack to make legacy types possible until we drop them
if (factor1 instanceof LegacyTypeInformationType) {
return Optional.of(argumentDataTypes.get(0));
}
if (factor2 instanceof LegacyTypeInformationType) {
return Optional.of(argumentDataTypes.get(1));
}
if (!isDecimalComputation(factor1, factor2)) {
return Optional.empty();
}
final DecimalType decimalType =
LogicalTypeMerging.findMultiplicationDecimalType(
getPrecision(factor1),
getScale(factor1),
getPrecision(factor2),
getScale(factor2));
return Optional.of(fromLogicalToDataType(decimalType));
} | Type strategy that returns the product of an exact numeric multiplication that includes at least
one decimal. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/DecimalTimesTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/DecimalTimesTypeStrategy.java | Apache-2.0 |
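Multiplication follows an analogous rule; the sketch below uses the common Hive-style formula (assumed here, and again without the 38-digit capping that the real merging logic performs):

```java
public class DecimalTimesSketch {

    /** Returns {precision, scale} for DECIMAL(p1,s1) * DECIMAL(p2,s2), ignoring the 38-digit cap. */
    static int[] multiplicationType(int p1, int s1, int p2, int s2) {
        // the product of a p1-digit and a p2-digit number has at most p1 + p2 digits
        return new int[] {p1 + p2, s1 + s2};
    }
}
```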
@Override
public Optional<DataType> inferType(CallContext callContext) {
return initialStrategy.inferType(callContext).map(DataType::nullable);
} | Forces a given type strategy to be nullable. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ForceNullableTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ForceNullableTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
DataType rowDataType = argumentDataTypes.get(0);
Optional<DataType> result = Optional.empty();
Optional<String> fieldName = callContext.getArgumentValue(1, String.class);
if (fieldName.isPresent()) {
result = DataTypeUtils.getField(rowDataType, fieldName.get());
}
Optional<Integer> fieldIndex = callContext.getArgumentValue(1, Integer.class);
if (fieldIndex.isPresent()) {
result = DataTypeUtils.getField(rowDataType, fieldIndex.get());
}
return result.map(
type -> {
if (rowDataType.getLogicalType().isNullable()) {
return type.nullable();
} else {
return type;
}
});
} | Type strategy that returns a type of a field nested inside a composite type that is described by
the second argument. The second argument must be a literal that describes either the nested field
name or index. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/GetTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/GetTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
final LogicalType addend1 = argumentDataTypes.get(0).getLogicalType();
final LogicalType addend2 = argumentDataTypes.get(1).getLogicalType();
Preconditions.checkArgument(addend1.is(LogicalTypeRoot.DECIMAL), ERROR_MSG);
Preconditions.checkArgument(addend2.is(LogicalTypeRoot.DECIMAL), ERROR_MSG);
return Optional.of(fromLogicalToDataType(addend2));
} | Type strategy that returns the result type of a decimal addition, used internally for
implementing native SUM/AVG aggregations on a Decimal type. It is used here to prevent the normal
{@link DecimalPlusTypeStrategy} from overriding the special calculation for precision and scale
needed by the aggregate function. {@link LogicalTypeMerging#findAdditionDecimalType} would adjust
the precision according to the two input arguments, but for Hive we simply keep the precision of
the input type because both input precisions are the same. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/HiveAggDecimalPlusTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/HiveAggDecimalPlusTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
final DataType inputDataType = argumentDataTypes.get(0);
final DataType nullReplacementDataType = argumentDataTypes.get(1);
if (!inputDataType.getLogicalType().isNullable()) {
return Optional.of(inputDataType);
}
return LogicalTypeMerging.findCommonType(
Arrays.asList(
inputDataType.getLogicalType(),
nullReplacementDataType.getLogicalType()))
.map(t -> t.copy(nullReplacementDataType.getLogicalType().isNullable()))
.map(TypeConversions::fromLogicalToDataType);
} | Type strategy specific for avoiding nulls. <br>
If arg0 is non-nullable, the output data type is exactly the data type of arg0. Otherwise, the
output data type is the common type of arg0 and arg1; in that case, the output type is nullable
only if both arguments are nullable. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/IfNullTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/IfNullTypeStrategy.java | Apache-2.0 |
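The nullability outcome of this strategy can be summarized by a two-line rule; the sketch below is a simplification that ignores the common-type computation itself (names are illustrative):

```java
public class IfNullNullabilitySketch {

    /** Result nullability of IFNULL(input, replacement) as inferred by the strategy above. */
    static boolean resultNullable(boolean inputNullable, boolean replacementNullable) {
        if (!inputNullable) {
            return false; // the input can never be null, so neither can the result
        }
        return replacementNullable; // otherwise the replacement's nullability decides
    }
}
```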
@Override
public Optional<DataType> inferType(CallContext callContext) {
DataType arrayOrMapType = callContext.getArgumentDataTypes().get(0);
final Optional<DataType> legacyArrayElement =
StrategyUtils.extractLegacyArrayElement(arrayOrMapType);
if (legacyArrayElement.isPresent()) {
return legacyArrayElement;
}
if (arrayOrMapType.getLogicalType().is(LogicalTypeRoot.ARRAY)) {
return Optional.of(
((CollectionDataType) arrayOrMapType).getElementDataType().nullable());
} else if (arrayOrMapType instanceof KeyValueDataType) {
return Optional.of(((KeyValueDataType) arrayOrMapType).getValueDataType().nullable());
}
return Optional.empty();
} | An output type strategy for {@link BuiltInFunctionDefinitions#AT}.
<p>Returns either the element of an {@link LogicalTypeFamily#COLLECTION} type or the value of
{@link LogicalTypeRoot#MAP}. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ItemAtTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/ItemAtTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
if (argumentDataTypes.size() < 2) {
return Optional.empty();
}
return Optional.of(
DataTypes.MAP(argumentDataTypes.get(0), argumentDataTypes.get(1)).notNull());
} | Type strategy that returns a {@link DataTypes#MAP(DataType, DataType)} with a key type equal to
the type of the first argument and a value type equal to the type of the second argument. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/MapTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/MapTypeStrategy.java | Apache-2.0 |
private static @Nullable Integer commonMin(List<ArgumentCount> counts) {
// min=5, min=3, min=0 -> min=0
// min=5, min=3, min=0, min=null -> min=null
int commonMin = Integer.MAX_VALUE;
for (ArgumentCount count : counts) {
final Optional<Integer> min = count.getMinCount();
if (!min.isPresent()) {
return null;
}
commonMin = Math.min(commonMin, min.get());
}
if (commonMin == Integer.MAX_VALUE) {
return null;
}
return commonMin;
} | Returns the common minimum argument count or null if undefined. | commonMin | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/OrInputTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/OrInputTypeStrategy.java | Apache-2.0 |
private static @Nullable Integer commonMax(List<ArgumentCount> counts) {
// max=5, max=3, max=0 -> max=5
// max=5, max=3, max=0, max=null -> max=null
int commonMax = Integer.MIN_VALUE;
for (ArgumentCount count : counts) {
final Optional<Integer> max = count.getMaxCount();
if (!max.isPresent()) {
return null;
}
commonMax = Math.max(commonMax, max.get());
}
if (commonMax == Integer.MIN_VALUE) {
return null;
}
return commonMax;
} | Returns the common maximum argument count or null if undefined. | commonMax | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/OrInputTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/OrInputTypeStrategy.java | Apache-2.0 |
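Both helpers follow the same pattern: an undefined bound poisons the whole computation. A stdlib-only analogue, modeling each strategy's bound as an `Optional<Integer>` (class and method names are illustrative), might look like:

```java
import java.util.List;
import java.util.Optional;

public class CommonCountSketch {

    /** Common minimum across bounds; empty if any bound is undefined or the list is empty. */
    static Optional<Integer> commonMin(List<Optional<Integer>> mins) {
        int common = Integer.MAX_VALUE;
        for (Optional<Integer> min : mins) {
            if (!min.isPresent()) {
                return Optional.empty(); // one open-ended strategy makes the union open-ended
            }
            common = Math.min(common, min.get());
        }
        return common == Integer.MAX_VALUE ? Optional.empty() : Optional.of(common);
    }

    /** Common maximum across bounds; empty if any bound is undefined or the list is empty. */
    static Optional<Integer> commonMax(List<Optional<Integer>> maxes) {
        int common = Integer.MIN_VALUE;
        for (Optional<Integer> max : maxes) {
            if (!max.isPresent()) {
                return Optional.empty();
            }
            common = Math.max(common, max.get());
        }
        return common == Integer.MIN_VALUE ? Optional.empty() : Optional.of(common);
    }
}
```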
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
final DataType argumentDataType = argumentDataTypes.get(0);
final LogicalType argumentType = argumentDataType.getLogicalType();
// a hack to make legacy types possible until we drop them
if (argumentType instanceof LegacyTypeInformationType) {
return Optional.of(argumentDataType);
}
if (!argumentType.is(LogicalTypeRoot.DECIMAL)) {
return Optional.of(argumentDataType);
}
final BigDecimal roundLength;
if (argumentDataTypes.size() == 2) {
if (!callContext.isArgumentLiteral(1) || callContext.isArgumentNull(1)) {
return Optional.of(argumentDataType);
}
roundLength =
callContext
.getArgumentValue(1, BigDecimal.class)
.orElseThrow(AssertionError::new);
} else {
roundLength = BigDecimal.ZERO;
}
final LogicalType inferredType =
LogicalTypeMerging.findRoundDecimalType(
getPrecision(argumentType),
getScale(argumentType),
roundLength.intValueExact());
return Optional.of(fromLogicalToDataType(inferredType));
} | Type strategy that returns the result of a rounding operation. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/RoundTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/RoundTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final DataType dataType = callContext.getArgumentDataTypes().get(0);
final LogicalType inputType = dataType.getLogicalType();
if (inputType.is(LogicalTypeRoot.TIMESTAMP_WITHOUT_TIME_ZONE)
|| inputType.is(LogicalTypeRoot.TIMESTAMP_WITH_LOCAL_TIME_ZONE)) {
return Optional.of(dataType);
} else if (inputType.is(LogicalTypeRoot.BIGINT)) {
final DataType timestampType = DataTypes.TIMESTAMP(3);
if (dataType.getLogicalType().isNullable()) {
return Optional.of(timestampType.nullable());
} else {
return Optional.of(timestampType.notNull());
}
}
return Optional.empty();
} | Type strategy for {@link BuiltInFunctionDefinitions#ROWTIME} which mirrors the type of the passed
rowtime column, but returns {@link LogicalTypeRoot#TIMESTAMP_WITHOUT_TIME_ZONE} in {@code
BATCH} mode. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/RowtimeTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/RowtimeTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
DataTypes.Field[] fields =
IntStream.range(0, argumentDataTypes.size())
.mapToObj(idx -> DataTypes.FIELD("f" + idx, argumentDataTypes.get(idx)))
.toArray(DataTypes.Field[]::new);
return Optional.of(DataTypes.ROW(fields).notNull());
} | Type strategy that returns a {@link DataTypes#ROW()} with field types equal to the input types. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/RowTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/RowTypeStrategy.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final DataType timestampDataType =
callContext
.getOutputDataType()
.filter(dt -> dt.getLogicalType().is(LogicalTypeFamily.TIMESTAMP))
.orElse(DataTypes.TIMESTAMP_LTZ(3));
return Optional.of(timestampDataType);
} | Type strategy specific for source watermarks that depend on the output type. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SourceWatermarkTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SourceWatermarkTypeStrategy.java | Apache-2.0 |
public static ArgumentTypeStrategy percentageArray(boolean expectedNullability) {
return new PercentageArrayArgumentTypeStrategy(expectedNullability);
} | An {@link ArgumentTypeStrategy} that expects an array of percentages with each element
in the range [0.0, 1.0]. | percentageArray | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SpecificInputTypeStrategies.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SpecificInputTypeStrategies.java | Apache-2.0 |
@Override
public Optional<DataType> inferType(CallContext callContext) {
final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
final LogicalType type1 = argumentDataTypes.get(0).getLogicalType();
final LogicalType type2 = argumentDataTypes.get(1).getLogicalType();
int length = getLength(type1) + getLength(type2);
// handle overflow
if (length < 0) {
length = CharType.MAX_LENGTH;
}
final LogicalType minimumType;
if (type1.is(LogicalTypeFamily.CHARACTER_STRING)
|| type2.is(LogicalTypeFamily.CHARACTER_STRING)) {
minimumType = new CharType(false, length);
} else if (type1.is(LogicalTypeFamily.BINARY_STRING)
|| type2.is(LogicalTypeFamily.BINARY_STRING)) {
minimumType = new BinaryType(false, length);
} else {
return Optional.empty();
}
// deal with nullability handling and varying semantics
return findCommonType(Arrays.asList(type1, type2, minimumType))
.map(TypeConversions::fromLogicalToDataType);
} | Type strategy that returns the type of a string concatenation. It assumes that the first two
arguments are of the same family of either {@link LogicalTypeFamily#BINARY_STRING} or {@link
LogicalTypeFamily#CHARACTER_STRING}. | inferType | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/StringConcatTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/StringConcatTypeStrategy.java | Apache-2.0 |
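The overflow guard on the length sum is worth spelling out: declared string lengths can be as large as `Integer.MAX_VALUE`, so the int addition can wrap into the negative range. A minimal sketch, assuming MAX_LENGTH equals `Integer.MAX_VALUE` as in `CharType`:

```java
public class ConcatLengthSketch {

    static final int MAX_LENGTH = Integer.MAX_VALUE; // assumed, mirrors CharType.MAX_LENGTH

    /** Adds two declared string lengths, clamping on int overflow. */
    static int concatLength(int length1, int length2) {
        int length = length1 + length2;
        if (length < 0) { // overflow wrapped into the negative range
            length = MAX_LENGTH;
        }
        return length;
    }
}
```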
public SubsequenceStrategyBuilder argument(ArgumentTypeStrategy argumentTypeStrategy) {
SequenceInputTypeStrategy singleArgumentStrategy =
new SequenceInputTypeStrategy(
Collections.singletonList(argumentTypeStrategy), null);
argumentsSplits.add(
new ArgumentsSplit(currentPos, currentPos + 1, singleArgumentStrategy));
currentPos += 1;
return this;
} | Defines that we expect a single argument at the next position. | argument | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SubsequenceInputTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SubsequenceInputTypeStrategy.java | Apache-2.0 |
public SubsequenceStrategyBuilder argument(
String argumentName, ArgumentTypeStrategy argumentTypeStrategy) {
SequenceInputTypeStrategy singleArgumentStrategy =
new SequenceInputTypeStrategy(
Collections.singletonList(argumentTypeStrategy),
Collections.singletonList(argumentName));
argumentsSplits.add(
new ArgumentsSplit(currentPos, currentPos + 1, singleArgumentStrategy));
currentPos += 1;
return this;
} | Defines that we expect a single named argument at the next position. | argument | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SubsequenceInputTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SubsequenceInputTypeStrategy.java | Apache-2.0 |
public SubsequenceStrategyBuilder subsequence(InputTypeStrategy inputTypeStrategy) {
Preconditions.checkArgument(
inputTypeStrategy.getArgumentCount() instanceof ConstantArgumentCount);
Optional<Integer> maxCount = inputTypeStrategy.getArgumentCount().getMaxCount();
Optional<Integer> minCount = inputTypeStrategy.getArgumentCount().getMinCount();
if (!maxCount.isPresent()
|| !minCount.isPresent()
|| !maxCount.get().equals(minCount.get())) {
throw new IllegalArgumentException(
"Both the minimum and maximum number of expected arguments must"
+ " be defined and equal to each other.");
}
argumentsSplits.add(
new ArgumentsSplit(currentPos, currentPos + maxCount.get(), inputTypeStrategy));
currentPos += maxCount.get();
return this;
} | Defines a common {@link InputTypeStrategy} for the next arguments. The given input strategy
must expect a constant number of arguments, i.e. both the minimum and maximum
number of arguments must be defined and equal to each other.
<p>If you need varying logic, use {@link #finishWithVarying(InputTypeStrategy)}. | subsequence | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SubsequenceInputTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SubsequenceInputTypeStrategy.java | Apache-2.0 |
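The "constant argument count" precondition enforced by `subsequence(...)` above (minimum and maximum both defined and equal) can be sketched without Flink's classes. `isConstantCount` is a hypothetical helper, not a Flink API.

```java
import java.util.Optional;

// Hedged sketch of the constant-count check performed in subsequence(...):
// a strategy qualifies only if min and max counts are both present and equal.
public class ConstantCountCheck {
    static boolean isConstantCount(Optional<Integer> min, Optional<Integer> max) {
        return min.isPresent() && max.isPresent() && min.get().equals(max.get());
    }

    public static void main(String[] args) {
        System.out.println(isConstantCount(Optional.of(2), Optional.of(2)));   // true
        System.out.println(isConstantCount(Optional.of(1), Optional.empty())); // false
    }
}
```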
public InputTypeStrategy finishWithVarying(InputTypeStrategy inputTypeStrategy) {
final ArgumentCount strategyArgumentCount = inputTypeStrategy.getArgumentCount();
strategyArgumentCount
.getMaxCount()
.ifPresent(
c -> {
throw new IllegalArgumentException(
"The maximum number of arguments must not be defined.");
});
argumentsSplits.add(new ArgumentsSplit(currentPos, null, inputTypeStrategy));
final int minCount = currentPos + strategyArgumentCount.getMinCount().orElse(0);
return new SubsequenceInputTypeStrategy(
argumentsSplits, ConstantArgumentCount.from(minCount));
} | Defines a common {@link InputTypeStrategy} for the remaining arguments. The given input strategy
must expect a varying number of arguments, i.e. the maximum number of
arguments must not be defined. | finishWithVarying | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SubsequenceInputTypeStrategy.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/strategies/SubsequenceInputTypeStrategy.java | Apache-2.0 |
@Override
public DataType transform(DataType dataType) {
LogicalType logicalType = dataType.getLogicalType();
Class<?> conversionClass = conversions.get(logicalType.getTypeRoot());
if (conversionClass != null) {
return dataType.bridgedTo(conversionClass);
} else {
return dataType;
}
} | This type transformation transforms the specified data type to a new one with the expected
conversion class. The mapping from data type to conversion class is defined by the constructor
parameter {@link #conversions} map that maps from type root to the expected conversion class. | transform | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/transforms/DataTypeConversionClassTransformation.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/transforms/DataTypeConversionClassTransformation.java | Apache-2.0 |
@Override
public DataType transform(DataType typeToTransform) {
LogicalType logicalType = typeToTransform.getLogicalType();
if (logicalType instanceof LegacyTypeInformationType
&& logicalType.getTypeRoot() == LogicalTypeRoot.RAW) {
TypeInformation<?> typeInfo =
((LegacyTypeInformationType<?>) logicalType).getTypeInformation();
DataType rawDataType = new AtomicDataType(new TypeInformationRawType<>(typeInfo));
return logicalType.isNullable() ? rawDataType : rawDataType.notNull();
}
return typeToTransform;
} | This type transformation transforms the LEGACY('RAW', ...) type to the RAW(..., ?) type. | transform | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/transforms/LegacyRawTypeTransformation.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/transforms/LegacyRawTypeTransformation.java | Apache-2.0 |
public boolean isNullable() {
return isNullable;
} | Returns whether a value of this type can be {@code null}. | isNullable | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | Apache-2.0 |
public LogicalTypeRoot getTypeRoot() {
return typeRoot;
} | Returns the root of this type. It is an essential description without additional parameters. | getTypeRoot | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | Apache-2.0 |
public boolean is(LogicalTypeRoot typeRoot) {
return this.typeRoot == typeRoot;
} | Returns whether the root of this type equals the given {@code typeRoot}.
@param typeRoot The root type to check against for equality | is | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | Apache-2.0 |
public boolean isAnyOf(LogicalTypeRoot... typeRoots) {
return Arrays.stream(typeRoots).anyMatch(tr -> this.typeRoot == tr);
} | Returns whether the root of this type equals at least one of the given {@code typeRoots}.
@param typeRoots The root types to check against for equality | isAnyOf | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | Apache-2.0 |
public boolean isAnyOf(LogicalTypeFamily... typeFamilies) {
return Arrays.stream(typeFamilies).anyMatch(tf -> this.typeRoot.getFamilies().contains(tf));
} | Returns whether the root of this type is part of at least one of the given {@code typeFamilies}.
@param typeFamilies The families to check against for equality | isAnyOf | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | Apache-2.0 |
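The root-to-family lookup behind `is(...)` and `isAnyOf(...)` can be illustrated with a toy enum. The roots and families below are illustrative stand-ins, not the full Flink sets.

```java
import java.util.Arrays;
import java.util.EnumSet;
import java.util.Set;

// Toy model of the root/family relationship used by is(...) and isAnyOf(...).
public class FamilyCheck {
    enum Family { NUMERIC, CHARACTER_STRING }

    enum Root {
        INTEGER(EnumSet.of(Family.NUMERIC)),
        VARCHAR(EnumSet.of(Family.CHARACTER_STRING));

        final Set<Family> families;
        Root(Set<Family> families) { this.families = families; }
    }

    // Mirrors isAnyOf(LogicalTypeFamily...): true if the root belongs
    // to at least one of the given families.
    static boolean isAnyOf(Root root, Family... typeFamilies) {
        return Arrays.stream(typeFamilies).anyMatch(f -> root.families.contains(f));
    }

    public static void main(String[] args) {
        System.out.println(isAnyOf(Root.INTEGER, Family.NUMERIC)); // true
        System.out.println(isAnyOf(Root.VARCHAR, Family.NUMERIC)); // false
    }
}
```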
public boolean is(LogicalTypeFamily family) {
return typeRoot.getFamilies().contains(family);
} | Returns whether the root of this type belongs to the given type {@code family}.
@param family The family type to check against for equality | is | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | Apache-2.0 |
public final LogicalType copy() {
return copy(isNullable);
} | Returns a deep copy of this type. It requires an implementation of {@link #copy(boolean)}.
@return a deep copy | copy | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | Apache-2.0 |
public String asSummaryString() {
return asSerializableString();
} | Returns a string that summarizes this type for printing to a console. An implementation might
shorten long names or skip very specific properties.
<p>Use {@link #asSerializableString()} for a type string that fully serializes this instance.
@return summary string of this type for debugging purposes | asSummaryString | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalType.java | Apache-2.0 |
public Set<LogicalTypeFamily> getFamilies() {
return families;
} | An enumeration of logical type roots containing static information about logical data types.
<p>A root is an essential description of a {@link LogicalType} without additional parameters. For
example, a parameterized logical type {@code DECIMAL(12,3)} possesses all characteristics of its
root {@code DECIMAL}. Additionally, a logical type root enables efficient comparison during the
evaluation of types.
<p>The enumeration is very close to the SQL standard in terms of naming and completeness.
However, it reflects just a subset of the evolving standard and contains some extensions (such as
{@code NULL}, {@code SYMBOL}, or {@code RAW}).
<p>See the type-implementing classes for a more detailed description of each type.
<p>Note to implementers: Whenever we perform a match against a type root (e.g. using a
switch/case statement), it is recommended to:
<ul>
<li>Order the items by the type root definition in this class for easy readability.
<li>Think about the behavior of all type roots for the implementation. A default fallback is
dangerous when introducing a new type root in the future.
<li>In many <b>runtime</b> cases, resolve the indirection of {@link #DISTINCT_TYPE}: {@code
return myMethod(((DistinctType) type).getSourceType())}
</ul> | getFamilies | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalTypeRoot.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalTypeRoot.java | Apache-2.0 |
default R visit(DescriptorType descriptorType) {
return visit((LogicalType) descriptorType);
} | The visitor definition of {@link LogicalType}. The visitor transforms a logical type into
instances of {@code R}.
<p>Incomplete types such as the {@link TypeInformationRawType} or {@link
UnresolvedUserDefinedType} are visited through the generic {@link #visit(LogicalType)}.
@param <R> result type | visit | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalTypeVisitor.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/LogicalTypeVisitor.java | Apache-2.0 |
@SuppressWarnings({"unchecked", "rawtypes"})
public static RawType<?> restore(
ClassLoader classLoader, String className, String serializerString) {
try {
final Class<?> clazz = Class.forName(className, true, classLoader);
final byte[] bytes = EncodingUtils.decodeBase64ToBytes(serializerString);
final DataInputDeserializer inputDeserializer = new DataInputDeserializer(bytes);
final TypeSerializerSnapshot<?> snapshot =
TypeSerializerSnapshot.readVersionedSnapshot(inputDeserializer, classLoader);
return (RawType<?>) new RawType(clazz, snapshot.restoreSerializer());
} catch (Throwable t) {
throw new ValidationException(
String.format(
"Unable to restore the RAW type of class '%s' with serializer snapshot '%s'.",
className, serializerString),
t);
}
} | Restores a raw type from the components of a serialized string representation. | restore | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/RawType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/RawType.java | Apache-2.0 |
public String getSerializerString() {
if (serializerString == null) {
final DataOutputSerializer outputSerializer = new DataOutputSerializer(128);
try {
TypeSerializerSnapshot.writeVersionedSnapshot(
outputSerializer, serializer.snapshotConfiguration());
serializerString =
EncodingUtils.encodeBytesToBase64(outputSerializer.getCopyOfBuffer());
return serializerString;
} catch (Exception e) {
throw new TableException(
String.format(
"Unable to generate a string representation of the serializer snapshot of '%s' "
+ "describing the class '%s' for the RAW type.",
serializer.getClass().getName(), clazz.toString()),
e);
}
}
return serializerString;
} | Returns the serialized {@link TypeSerializerSnapshot} in Base64 encoding of this raw type. | getSerializerString | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/RawType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/RawType.java | Apache-2.0 |
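The Base64 step above (via Flink's `EncodingUtils`) behaves like the JDK's `java.util.Base64`; here is a minimal round trip, using a stand-in byte array in place of a real `TypeSerializerSnapshot`.

```java
import java.util.Arrays;
import java.util.Base64;

// Round-tripping bytes through Base64, mirroring how a serializer
// snapshot is stringified above. The byte array is a stand-in only.
public class SnapshotEncoding {
    static String encode(byte[] snapshotBytes) {
        return Base64.getEncoder().encodeToString(snapshotBytes);
    }

    static byte[] decode(String encoded) {
        return Base64.getDecoder().decode(encoded);
    }

    public static void main(String[] args) {
        byte[] snapshot = {1, 2, 3};
        String s = encode(snapshot);
        System.out.println(Arrays.equals(snapshot, decode(s))); // true
    }
}
```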
public static StructuredType.Builder newBuilder(ObjectIdentifier objectIdentifier) {
return new StructuredType.Builder(objectIdentifier);
} | Creates a builder for a {@link StructuredType} that is identified by an {@link
ObjectIdentifier}. | newBuilder | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/StructuredType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/StructuredType.java | Apache-2.0 |
public static StructuredType.Builder newBuilder(
ObjectIdentifier objectIdentifier, Class<?> implementationClass) {
return new StructuredType.Builder(objectIdentifier, implementationClass);
} | Creates a builder for a {@link StructuredType} that is identified by an {@link ObjectIdentifier}
but with a resolved implementation class. | newBuilder | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/StructuredType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/StructuredType.java | Apache-2.0 |
public static StructuredType.Builder newBuilder(Class<?> implementationClass) {
return new StructuredType.Builder(implementationClass);
} | Creates a builder for a {@link StructuredType} that is identified by a class name derived
from the given implementation class. | newBuilder | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/StructuredType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/StructuredType.java | Apache-2.0 |
public static Optional<Class<?>> resolveClass(ClassLoader classLoader, String className) {
checkClassName(className);
try {
// Initialization is deferred until first instantiation
return Optional.of(Class.forName(className, false, classLoader));
} catch (Throwable t) {
return Optional.empty();
}
} | Restores an implementation class from the class name component of a serialized string
representation.
<p>Note: This method does not perform any kind of validation. The logical type system should
not be destabilized by incorrectly implemented classes. This is also why classes won't get
initialized. At this stage, only the existence of the class (i.e. its metadata) in the classloader matters. | resolveClass | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/StructuredType.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/StructuredType.java | Apache-2.0 |
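The deferred-initialization lookup in `resolveClass` can be reproduced with plain reflection. Note the `false` flag to `Class.forName`, which matches the snippet above and skips static initializers.

```java
import java.util.Optional;

// Resolving a class without triggering its static initializers,
// as resolveClass(...) above does. Any failure yields an empty Optional.
public class ClassResolver {
    static Optional<Class<?>> resolve(ClassLoader classLoader, String className) {
        try {
            // 'false' defers initialization until the class is actually used
            return Optional.of(Class.forName(className, false, classLoader));
        } catch (Throwable t) {
            return Optional.empty(); // a missing class is not an error here
        }
    }

    public static void main(String[] args) {
        ClassLoader cl = ClassResolver.class.getClassLoader();
        System.out.println(resolve(cl, "java.util.ArrayList").isPresent()); // true
        System.out.println(resolve(cl, "no.such.Clazz").isPresent());       // false
    }
}
```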
public static boolean supportsAvoidingCast(LogicalType sourceType, LogicalType targetType) {
final CastAvoidanceChecker checker = new CastAvoidanceChecker(sourceType);
return targetType.accept(checker);
} | Returns whether the source type can be safely interpreted as the target type. This allows
avoiding casts by ignoring some logical properties. This is basically a relaxed {@link
LogicalType#equals(Object)}.
<p>In particular this means:
<p>Atomic, non-string types (INT, BOOLEAN, ...) and user-defined structured types must be
fully equal (i.e. {@link LogicalType#equals(Object)}). However, a NOT NULL type can be stored
in a nullable type but not vice versa.
<p>Atomic, string types must be contained in the target type (e.g. CHAR(2) is contained in
VARCHAR(3), but VARCHAR(2) is not contained in CHAR(3)). Same for binary strings.
<p>Constructed types (ARRAY, ROW, MAP, etc.) and user-defined distinct type must be of same
kind but ignore field names and other logical attributes. Structured and row kinds are
compatible. However, all the children types ({@link LogicalType#getChildren()}) must be
compatible. | supportsAvoidingCast | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/utils/LogicalTypeCasts.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/utils/LogicalTypeCasts.java | Apache-2.0 |
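The string-containment rule described above (CHAR(2) is contained in VARCHAR(3), but VARCHAR(2) is not contained in CHAR(3)) can be sketched as a small predicate. `fits` is a hypothetical helper; it deliberately ignores nullability and other logical attributes.

```java
// Hedged sketch of the character-string containment rule described above.
// 'varying' distinguishes VARCHAR from CHAR; nullability is ignored here.
public class StringContainment {
    static boolean fits(boolean sourceVarying, int sourceLength,
                        boolean targetVarying, int targetLength) {
        if (!targetVarying) {
            // a fixed-length target only accepts an identical fixed-length source
            return !sourceVarying && sourceLength == targetLength;
        }
        // a varying-length target just needs enough capacity
        return sourceLength <= targetLength;
    }

    public static void main(String[] args) {
        System.out.println(fits(false, 2, true, 3)); // CHAR(2) into VARCHAR(3): true
        System.out.println(fits(true, 2, false, 3)); // VARCHAR(2) into CHAR(3): false
    }
}
```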
public static boolean supportsImplicitCast(LogicalType sourceType, LogicalType targetType) {
return supportsCasting(sourceType, targetType, false);
} | Returns whether the source type can be safely cast to the target type without losing
information.
<p>Implicit casts are used for type widening and type generalization (finding a common
supertype for a set of types). Implicit casts are similar to the Java semantics (e.g. this is
not possible: {@code int x = (String) z}). | supportsImplicitCast | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/utils/LogicalTypeCasts.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/utils/LogicalTypeCasts.java | Apache-2.0 |
public static boolean supportsExplicitCast(LogicalType sourceType, LogicalType targetType) {
return supportsCasting(sourceType, targetType, true);
} | Returns whether the source type can be cast to the target type.
<p>Explicit casts correspond to the SQL cast specification and represent the logic behind a
{@code CAST(sourceType AS targetType)} operation. For example, it allows for converting most
types of the {@link LogicalTypeFamily#PREDEFINED} family to types of the {@link
LogicalTypeFamily#CHARACTER_STRING} family. | supportsExplicitCast | java | apache/flink | flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/utils/LogicalTypeCasts.java | https://github.com/apache/flink/blob/master/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/utils/LogicalTypeCasts.java | Apache-2.0 |