| code | docstring | func_name | language | repo | path | url | license |
|---|---|---|---|---|---|---|---|
public static Deadline now() {
return new Deadline(System.nanoTime(), SystemClock.getInstance());
}
|
Constructs a {@link Deadline} that has now as the deadline. Use this and then extend via
{@link #plus(Duration)} to specify a deadline in the future.
|
now
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/time/Deadline.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/time/Deadline.java
|
Apache-2.0
|
public static Deadline fromNow(Duration duration) {
return new Deadline(
addHandlingOverflow(System.nanoTime(), duration.toNanos()),
SystemClock.getInstance());
}
|
Constructs a Deadline that is a given duration after now.
|
fromNow
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/time/Deadline.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/time/Deadline.java
|
Apache-2.0
|
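A minimal usage sketch of the two factories above; `hasTimeLeft()` is assumed to be part of the `Deadline` API, as in Flink:

```java
// Two equivalent ways to obtain a deadline 30 seconds in the future:
Deadline viaPlus = Deadline.now().plus(Duration.ofSeconds(30));
Deadline viaFromNow = Deadline.fromNow(Duration.ofSeconds(30));

// Assumed API: query whether the deadline has not yet passed.
boolean stillValid = viaFromNow.hasTimeLeft();
```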
@PublicEvolving
public Class<?> getComponentClass() {
return this.arrayClass.getComponentType();
}
|
Gets the class that represents the component type.
@return The class of the component type.
|
getComponentClass
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/PrimitiveArrayTypeInfo.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/PrimitiveArrayTypeInfo.java
|
Apache-2.0
|
@Override
@PublicEvolving
public PrimitiveArrayComparator<T, ?> createComparator(
boolean sortOrderAscending, ExecutionConfig executionConfig) {
try {
return comparatorClass.getConstructor(boolean.class).newInstance(sortOrderAscending);
} catch (Exception e) {
throw new RuntimeException(
"Could not initialize primitive "
+ comparatorClass.getName()
+ " array comparator.",
e);
}
}
|
Creates a {@link PrimitiveArrayComparator} for this primitive array type by instantiating the
comparator class with the given sort order.
|
createComparator
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/PrimitiveArrayTypeInfo.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/PrimitiveArrayTypeInfo.java
|
Apache-2.0
|
@PublicEvolving
public Map<String, TypeInformation<?>> getGenericParameters() {
// return an empty map as the default implementation
return Collections.emptyMap();
}
|
Optional method for giving Flink's type extraction system information about the mapping of a
generic type parameter to the type information of a subtype. This information is necessary in
cases where type information should be deduced from an input type.
<p>For instance, a method for a {@link Tuple2} would look like this: <code>
Map m = new HashMap();
m.put("T0", this.getTypeAt(0));
m.put("T1", this.getTypeAt(1));
return m;
</code>
@return map of inferred subtypes; it does not have to contain all generic parameters as key;
values may be null if type could not be inferred
|
getGenericParameters
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInformation.java
|
Apache-2.0
|
@PublicEvolving
public boolean isSortKeyType() {
return isKeyType();
}
|
Checks whether this type can be used as a key for sorting. The order produced by sorting this
type must be meaningful.
|
isSortKeyType
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInformation.java
|
Apache-2.0
|
public static <T> TypeInformation<T> of(Class<T> typeClass) {
try {
return TypeExtractor.createTypeInfo(typeClass);
} catch (InvalidTypesException e) {
throw new FlinkRuntimeException(
"Cannot extract TypeInformation from Class alone, because generic parameters are missing. "
+ "Please use TypeInformation.of(TypeHint) instead, or another equivalent method in the API that "
+ "accepts a TypeHint instead of a Class. "
+ "For example for a Tuple2<Long, String> pass a 'new TypeHint<Tuple2<Long, String>>(){}'.");
}
}
|
Creates a TypeInformation for the type described by the given class.
<p>This method only works for non-generic types. For generic types, use the {@link
#of(TypeHint)} method.
@param typeClass The class of the type.
@param <T> The generic type.
@return The TypeInformation object for the type described by the hint.
|
of
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInformation.java
|
Apache-2.0
|
public static <T> TypeInformation<T> of(TypeHint<T> typeHint) {
return typeHint.getTypeInfo();
}
|
Creates a TypeInformation for a generic type via a utility "type hint". This method can be
used as follows:
<pre>{@code
TypeInformation<Tuple2<String, Long>> info = TypeInformation.of(new TypeHint<Tuple2<String, Long>>(){});
}</pre>
@param typeHint The hint for the generic type.
@param <T> The generic type.
@return The TypeInformation object for the type described by the hint.
|
of
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInformation.java
|
Apache-2.0
|
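A short sketch contrasting the two `of(...)` overloads above (imports of `Tuple2` and `TypeHint` assumed):

```java
// Non-generic types can be extracted from the Class alone:
TypeInformation<String> stringInfo = TypeInformation.of(String.class);

// Generic types lose their parameters to erasure, so a TypeHint is needed:
TypeInformation<Tuple2<String, Long>> tupleInfo =
        TypeInformation.of(new TypeHint<Tuple2<String, Long>>() {});
```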
public static <T> TypeInformation<T> GENERIC(Class<T> genericClass) {
return new GenericTypeInfo<>(genericClass);
}
|
Returns generic type information for any Java object. The serialization logic will use the
general purpose serializer Kryo.
<p>Generic types are black-boxes for Flink, but allow any object and null values in fields.
<p>By default, serialization of this type is not very efficient. Please read the
documentation about how to improve efficiency (namely by pre-registering classes).
@param genericClass any Java class
|
GENERIC
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
Apache-2.0
|
@SuppressWarnings("unchecked")
public static <E> TypeInformation<E[]> OBJECT_ARRAY(TypeInformation<E> elementType) {
if (elementType == Types.STRING) {
return (TypeInformation) BasicArrayTypeInfo.STRING_ARRAY_TYPE_INFO;
}
return ObjectArrayTypeInfo.getInfoFor(elementType);
}
|
Returns type information for Java arrays of object types (such as <code>String[]</code>,
<code>Integer[]</code>). The array itself must not be null. Null values for elements are
supported.
@param elementType element type of the array
|
OBJECT_ARRAY
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
Apache-2.0
|
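A sketch of the two array-related factories above; `MyPojo` is a hypothetical user class:

```java
// String[] is special-cased to BasicArrayTypeInfo; other object arrays
// go through ObjectArrayTypeInfo:
TypeInformation<String[]> stringArray = Types.OBJECT_ARRAY(Types.STRING);
TypeInformation<Integer[]> intArray = Types.OBJECT_ARRAY(Types.INT);

// Kryo-backed fallback for an arbitrary class (MyPojo is hypothetical):
TypeInformation<MyPojo> generic = Types.GENERIC(MyPojo.class);
```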
public static <K, V> TypeInformation<Map<K, V>> MAP(
TypeInformation<K> keyType, TypeInformation<V> valueType) {
return new MapTypeInfo<>(keyType, valueType);
}
|
Returns type information for a Java {@link java.util.Map}. A map must not be null. Null
values in keys are not supported. An entry's value can be null.
<p>By default, maps are untyped and treated as a generic type in Flink; therefore, it is
useful to pass type information whenever a map is used.
<p><strong>Note:</strong> Flink does not preserve the concrete {@link Map} type. It converts
a map into {@link HashMap} when copying or deserializing.
@param keyType type information for the map's keys
@param valueType type information for the map's values
|
MAP
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
Apache-2.0
|
public static <E> TypeInformation<List<E>> LIST(TypeInformation<E> elementType) {
return new ListTypeInfo<>(elementType);
}
|
Returns type information for a Java {@link java.util.List}. A list must not be null. Null
values in elements are not supported.
<p>By default, lists are untyped and treated as a generic type in Flink; therefore, it is
useful to pass type information whenever a list is used.
<p><strong>Note:</strong> Flink does not preserve the concrete {@link List} type. It converts
a list into {@link ArrayList} when copying or deserializing.
@param elementType type information for the list's elements
|
LIST
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
Apache-2.0
|
public static <E> TypeInformation<Set<E>> SET(TypeInformation<E> elementType) {
return new SetTypeInfo<>(elementType);
}
|
Returns type information for a Java {@link java.util.Set}. A set must not be null. Null
values in elements are not supported.
<p>By default, sets are untyped and treated as a generic type in Flink; therefore, it is
useful to pass type information whenever a set is used.
<p><strong>Note:</strong> Flink does not preserve the concrete {@link Set} type. It converts
a set into {@link HashSet} when copying or deserializing.
@param elementType type information for the set's elements
|
SET
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
Apache-2.0
|
public static <E extends Enum<E>> TypeInformation<E> ENUM(Class<E> enumType) {
return new EnumTypeInfo<>(enumType);
}
|
Returns type information for Java enumerations. Null values are not supported.
@param enumType enumeration class extending {@link java.lang.Enum}
|
ENUM
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/Types.java
|
Apache-2.0
|
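A combined usage sketch of the collection and enum factories above, using the predefined `Types` constants:

```java
TypeInformation<Map<String, Long>> counts = Types.MAP(Types.STRING, Types.LONG);
TypeInformation<List<Integer>> ids = Types.LIST(Types.INT);
TypeInformation<Set<String>> tags = Types.SET(Types.STRING);

// Any Java enum works; java.util.concurrent.TimeUnit is just a convenient JDK example:
TypeInformation<TimeUnit> unit = Types.ENUM(TimeUnit.class);
```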
static PrecomputedParameters precompute(
boolean immutableTargetType, TypeSerializer<Object>[] fieldSerializers) {
Preconditions.checkNotNull(fieldSerializers);
int totalLength = 0;
boolean fieldsImmutable = true;
boolean stateful = false;
for (TypeSerializer<Object> fieldSerializer : fieldSerializers) {
Preconditions.checkNotNull(fieldSerializer);
if (fieldSerializer != fieldSerializer.duplicate()) {
stateful = true;
}
if (!fieldSerializer.isImmutableType()) {
fieldsImmutable = false;
}
if (fieldSerializer.getLength() < 0) {
totalLength = -1;
}
totalLength =
totalLength >= 0 ? totalLength + fieldSerializer.getLength() : totalLength;
}
return new PrecomputedParameters(
immutableTargetType, fieldsImmutable, totalLength, stateful);
}
|
Precomputes the serialization parameters (target immutability, field immutability, total
serialized length, and whether any field serializer is stateful) from the given field
serializers.
|
precompute
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeSerializer.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeSerializer.java
|
Apache-2.0
|
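The statefulness check in `precompute(...)` relies on the `duplicate()` contract; a minimal sketch of that probe, assuming any `TypeSerializer`:

```java
// A stateless serializer is expected to return itself from duplicate();
// getting a fresh instance back signals per-instance state.
static boolean isStateful(TypeSerializer<?> serializer) {
    return serializer != serializer.duplicate();
}
```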
@PublicEvolving
public Class<T> getTypeClass() {
return typeClass;
}
|
Returns the type class of the composite type
@return Type class of the composite type
|
getTypeClass
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeType.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeType.java
|
Apache-2.0
|
@PublicEvolving
public List<FlatFieldDescriptor> getFlatFields(String fieldExpression) {
List<FlatFieldDescriptor> result = new ArrayList<FlatFieldDescriptor>();
this.getFlatFields(fieldExpression, 0, result);
return result;
}
|
Returns the flat field descriptors for the given field expression.
@param fieldExpression The field expression for which the flat field descriptors are
computed.
@return The list of descriptors for the flat fields which are specified by the field
expression.
|
getFlatFields
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeType.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeType.java
|
Apache-2.0
|
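A hedged illustration of `getFlatFields(...)` using `RowTypeInfo`, which is a `CompositeType` (field names are made up):

```java
RowTypeInfo rowType = new RowTypeInfo(
        new TypeInformation<?>[] {Types.STRING, Types.INT},
        new String[] {"name", "count"});

// Resolve the field expression to flat field descriptors;
// each descriptor carries the flat position and type of a matched field.
List<CompositeType.FlatFieldDescriptor> flat = rowType.getFlatFields("count");
```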
@PublicEvolving
public boolean hasDeterministicFieldOrder() {
return false;
}
|
True if this type has an inherent ordering of the fields, such that a user can always be sure
in which order the fields appear. This is true for Tuples and Case Classes. It is not
true for regular Java objects, since there the ordering of the fields can be arbitrary.
<p>This is used when translating a DataSet or DataStream to an Expression Table, when
initially renaming the fields of the underlying type.
|
hasDeterministicFieldOrder
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeType.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeType.java
|
Apache-2.0
|
protected void readOuterSnapshot(
int readOuterSnapshotVersion, DataInputView in, ClassLoader userCodeClassLoader)
throws IOException {}
|
Reads the outer snapshot, i.e. any information beyond the nested serializers of the outer
serializer.
<p>The base implementation of this method reads nothing, i.e. it assumes that the outer
serializer only has nested serializers and no extra information. Otherwise, if the outer
serializer contains some extra information that has been persisted as part of the serializer
snapshot, this must be overridden. Note that this method and the corresponding methods {@link
#writeOuterSnapshot(DataOutputView)} and {@link
#resolveOuterSchemaCompatibility(TypeSerializerSnapshot)} need to be implemented together.
@param readOuterSnapshotVersion the read version of the outer snapshot.
@param in the {@link DataInputView} to read the outer snapshot from.
@param userCodeClassLoader the user code class loader.
|
readOuterSnapshot
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeTypeSerializerSnapshot.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeTypeSerializerSnapshot.java
|
Apache-2.0
|
protected OuterSchemaCompatibility resolveOuterSchemaCompatibility(
TypeSerializerSnapshot<T> oldSerializerSnapshot) {
// Call deprecated methods as default, which will be removed after removing these deprecated
// methods
@SuppressWarnings("unchecked")
S newSerializer = (S) this.restoreSerializer();
if (isOuterSnapshotCompatible(newSerializer)) {
if (oldSerializerSnapshot instanceof CompositeTypeSerializerSnapshot) {
return ((CompositeTypeSerializerSnapshot<T, S>) oldSerializerSnapshot)
.resolveOuterSchemaCompatibility(newSerializer);
}
return OuterSchemaCompatibility.COMPATIBLE_AS_IS;
}
return OuterSchemaCompatibility.INCOMPATIBLE;
}
|
Checks the schema compatibility of the given old serializer snapshot based on the outer
snapshot.
<p>The base implementation of this method assumes that the outer serializer only has nested
serializers and no extra information, and therefore the result of the check is {@link
OuterSchemaCompatibility#COMPATIBLE_AS_IS}. Otherwise, if the outer serializer contains some
extra information that has been persisted as part of the serializer snapshot, this must be
overridden. Note that this method and the corresponding methods {@link
#writeOuterSnapshot(DataOutputView)} and {@link #readOuterSnapshot(int, DataInputView,
ClassLoader)} need to be implemented together.
@param oldSerializerSnapshot the old serializer snapshot, which contains the old outer
information to check against.
@return a {@link OuterSchemaCompatibility} indicating whether the new serializer's outer
information is compatible, requires migration, or incompatible with the one written in
this snapshot.
|
resolveOuterSchemaCompatibility
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeTypeSerializerSnapshot.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeTypeSerializerSnapshot.java
|
Apache-2.0
|
@Deprecated
protected boolean isOuterSnapshotCompatible(S newSerializer) {
return true;
}
|
Checks whether the outer snapshot is compatible with a given new serializer.
<p>The base implementation of this method just returns {@code true}, i.e. it assumes that the
outer serializer only has nested serializers and no extra information, and therefore the
result of the check must always be true. Otherwise, if the outer serializer contains some
extra information that has been persisted as part of the serializer snapshot, this must be
overridden. Note that this method and the corresponding methods {@link
#writeOuterSnapshot(DataOutputView)} and {@link #readOuterSnapshot(int, DataInputView,
ClassLoader)} need to be implemented together.
@param newSerializer the new serializer, which contains the new outer information to check
against.
@return a flag indicating whether or not the new serializer's outer information is compatible
with the one written in this snapshot.
@deprecated this method is deprecated, and will be removed in the future. Please implement
{@link #resolveOuterSchemaCompatibility(TypeSerializerSnapshot)} instead.
|
isOuterSnapshotCompatible
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeTypeSerializerSnapshot.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeTypeSerializerSnapshot.java
|
Apache-2.0
|
public static <T> TypeSerializerSchemaCompatibility<T> delegateCompatibilityCheckToNewSnapshot(
TypeSerializerSnapshot<T> legacySerializerSnapshot,
CompositeTypeSerializerSnapshot<T, ? extends TypeSerializer<T>> newCompositeSnapshot,
TypeSerializerSnapshot<?>... legacyNestedSnapshots) {
checkArgument(legacyNestedSnapshots.length > 0);
return newCompositeSnapshot.internalResolveSchemaCompatibility(
legacySerializerSnapshot, legacyNestedSnapshots);
}
|
Delegates compatibility checks to a {@link CompositeTypeSerializerSnapshot} instance. This
can be used by legacy snapshot classes, which have a newer implementation implemented as a
{@link CompositeTypeSerializerSnapshot}.
@param legacySerializerSnapshot the legacy serializer snapshot to check for compatibility.
@param newCompositeSnapshot an instance of the new snapshot class to delegate compatibility
checks to. This instance should already contain the outer snapshot information.
@param legacyNestedSnapshots the nested serializer snapshots of the legacy composite
snapshot.
@return the result compatibility.
|
delegateCompatibilityCheckToNewSnapshot
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeTypeSerializerUtil.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeTypeSerializerUtil.java
|
Apache-2.0
|
public static void setNestedSerializersSnapshots(
CompositeTypeSerializerSnapshot<?, ?> compositeSnapshot,
TypeSerializerSnapshot<?>... nestedSnapshots) {
NestedSerializersSnapshotDelegate delegate =
new NestedSerializersSnapshotDelegate(nestedSnapshots);
compositeSnapshot.setNestedSerializersSnapshotDelegate(delegate);
}
|
Overrides the existing nested serializers' snapshots with the provided {@code
nestedSnapshots}.
@param compositeSnapshot the composite snapshot to overwrite its nested serializers.
@param nestedSnapshots the nested snapshots to overwrite with.
|
setNestedSerializersSnapshots
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeTypeSerializerUtil.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/CompositeTypeSerializerUtil.java
|
Apache-2.0
|
public TypeSerializer<?>[] getRestoredNestedSerializers() {
return snapshotsToRestoreSerializers(nestedSnapshots);
}
|
Produces a restore serializer from each contained serializer configuration snapshot. The
serializers are returned in the same order as the snapshots are stored.
|
getRestoredNestedSerializers
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/NestedSerializersSnapshotDelegate.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/NestedSerializersSnapshotDelegate.java
|
Apache-2.0
|
public <T> TypeSerializer<T> getRestoredNestedSerializer(int pos) {
checkArgument(pos < nestedSnapshots.length);
@SuppressWarnings("unchecked")
TypeSerializerSnapshot<T> snapshot = (TypeSerializerSnapshot<T>) nestedSnapshots[pos];
return snapshot.restoreSerializer();
}
|
Creates the restore serializer from the pos-th config snapshot.
|
getRestoredNestedSerializer
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/NestedSerializersSnapshotDelegate.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/NestedSerializersSnapshotDelegate.java
|
Apache-2.0
|
public TypeSerializerSnapshot<?>[] getNestedSerializerSnapshots() {
return nestedSnapshots;
}
|
Returns the snapshots of the nested serializers.
@return the snapshots of the nested serializers.
|
getNestedSerializerSnapshots
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/NestedSerializersSnapshotDelegate.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/NestedSerializersSnapshotDelegate.java
|
Apache-2.0
|
public final void writeNestedSerializerSnapshots(DataOutputView out) throws IOException {
out.writeInt(MAGIC_NUMBER);
out.writeInt(VERSION);
out.writeInt(nestedSnapshots.length);
for (TypeSerializerSnapshot<?> snap : nestedSnapshots) {
TypeSerializerSnapshot.writeVersionedSnapshot(out, snap);
}
}
|
Writes the composite snapshot of all the contained serializers.
|
writeNestedSerializerSnapshots
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/NestedSerializersSnapshotDelegate.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/NestedSerializersSnapshotDelegate.java
|
Apache-2.0
|
public boolean supportsCompareAgainstReference() {
return false;
}
|
Flags whether this comparator supports comparing the element that has been set as reference
in this type accessor to the element set as reference in a given other type accessor via
{@link #compareToReference(TypeComparator)}. Similar to comparing two elements {@code e1} and
{@code e2} via a comparator, that method can be used the following way.
<pre>{@code
E e1 = ...;
E e2 = ...;
TypeComparator<E> acc1 = ...;
TypeComparator<E> acc2 = ...;
acc1.setReference(e1);
acc2.setReference(e2);
int comp = acc1.compareToReference(acc2);
}</pre>
The rationale behind this mechanism is that elements are typically compared using certain
features that are extracted from them (such as deserializing a subset of fields). When
setting the reference, this extraction happens. The extraction needs to happen only once per
element, even though an element is typically compared to many other elements when
establishing a sorted order. The actual comparison performed by {@code compareToReference}
may be very cheap, as it happens on the extracted features.
@param referencedComparator The type accessor where the element for comparison has been set
as reference.
@return A value smaller than zero, if the reference value of {@code referencedComparator} is
smaller than the reference value of this type accessor; a value greater than zero, if it
is larger; zero, if both are equal.
@see #setReference(Object)
|
supportsCompareAgainstReference
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeComparator.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeComparator.java
|
Apache-2.0
|
@SuppressWarnings("rawtypes")
public int compareAgainstReference(Comparable[] keys) {
throw new UnsupportedOperationException("Workaround hack.");
}
|
Compares the element set as reference in this comparator against the given extracted key
fields. This is used together with {@link #extractKeys(Object, Object[], int)} to provide
interoperability between different record types.
<p>The base implementation is a placeholder and always throws an {@link
UnsupportedOperationException}.
@param keys The extracted keys to compare the reference against.
|
compareAgainstReference
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeComparator.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeComparator.java
|
Apache-2.0
|
public static <T> TypeSerializerSchemaCompatibility<T> compatibleAsIs() {
return new TypeSerializerSchemaCompatibility<>(Type.COMPATIBLE_AS_IS, null);
}
|
Returns a result that indicates that the new serializer is compatible and no migration is
required. The new serializer can continue to be used as is.
@return a result that indicates migration is not required for the new serializer.
|
compatibleAsIs
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
Apache-2.0
|
public static <T> TypeSerializerSchemaCompatibility<T> compatibleAfterMigration() {
return new TypeSerializerSchemaCompatibility<>(Type.COMPATIBLE_AFTER_MIGRATION, null);
}
|
Returns a result that indicates that the new serializer can be used after migrating the
written bytes, i.e. reading it with the old serializer and then writing it again with the new
serializer.
@return a result that indicates that the new serializer can be used after migrating the
written bytes.
|
compatibleAfterMigration
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
Apache-2.0
|
public static <T> TypeSerializerSchemaCompatibility<T> compatibleWithReconfiguredSerializer(
TypeSerializer<T> reconfiguredSerializer) {
return new TypeSerializerSchemaCompatibility<>(
Type.COMPATIBLE_WITH_RECONFIGURED_SERIALIZER,
Preconditions.checkNotNull(reconfiguredSerializer));
}
|
Returns a result that indicates a reconfigured version of the new serializer is compatible,
and should be used instead of the original new serializer.
@param reconfiguredSerializer the reconfigured version of the new serializer.
@return a result that indicates a reconfigured version of the new serializer is compatible,
and should be used instead of the original new serializer.
|
compatibleWithReconfiguredSerializer
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
Apache-2.0
|
public static <T> TypeSerializerSchemaCompatibility<T> incompatible() {
return new TypeSerializerSchemaCompatibility<>(Type.INCOMPATIBLE, null);
}
|
Returns a result that indicates there is no possible way for the new serializer to be
usable. This normally indicates that there is no common Java class between what the
previous bytes can be deserialized into and what can be written by the new serializer.
<p>In this case, there is no possible way for the new serializer to continue to be used, even
with migration. Recovery of the Flink job will fail.
@return a result that indicates incompatibility between the new and previous serializer.
|
incompatible
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
Apache-2.0
|
public boolean isCompatibleAsIs() {
return resultType == Type.COMPATIBLE_AS_IS;
}
|
Returns whether or not the type of the compatibility is {@link Type#COMPATIBLE_AS_IS}.
@return whether or not the type of the compatibility is {@link Type#COMPATIBLE_AS_IS}.
|
isCompatibleAsIs
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
Apache-2.0
|
public boolean isCompatibleAfterMigration() {
return resultType == Type.COMPATIBLE_AFTER_MIGRATION;
}
|
Returns whether or not the type of the compatibility is {@link
Type#COMPATIBLE_AFTER_MIGRATION}.
@return whether or not the type of the compatibility is {@link
Type#COMPATIBLE_AFTER_MIGRATION}.
|
isCompatibleAfterMigration
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
Apache-2.0
|
public boolean isCompatibleWithReconfiguredSerializer() {
return resultType == Type.COMPATIBLE_WITH_RECONFIGURED_SERIALIZER;
}
|
Returns whether or not the type of the compatibility is {@link
Type#COMPATIBLE_WITH_RECONFIGURED_SERIALIZER}.
@return whether or not the type of the compatibility is {@link
Type#COMPATIBLE_WITH_RECONFIGURED_SERIALIZER}.
|
isCompatibleWithReconfiguredSerializer
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
Apache-2.0
|
public TypeSerializer<T> getReconfiguredSerializer() {
Preconditions.checkState(
isCompatibleWithReconfiguredSerializer(),
"It is only possible to get a reconfigured serializer if the compatibility type is %s, but the type is %s",
Type.COMPATIBLE_WITH_RECONFIGURED_SERIALIZER,
resultType);
return reconfiguredNewSerializer;
}
|
Gets the reconfigured serializer. This throws an exception if {@link
#isCompatibleWithReconfiguredSerializer()} is {@code false}.
|
getReconfiguredSerializer
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSchemaCompatibility.java
|
Apache-2.0
|
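A hedged sketch of how a restore path might branch on the four compatibility results above; the migration branch is only indicated, not implemented:

```java
static <T> TypeSerializer<T> resolve(
        TypeSerializerSchemaCompatibility<T> compat, TypeSerializer<T> newSerializer) {
    if (compat.isCompatibleAsIs()) {
        return newSerializer;                      // keep using the new serializer
    } else if (compat.isCompatibleWithReconfiguredSerializer()) {
        return compat.getReconfiguredSerializer(); // use the adjusted instance instead
    } else if (compat.isCompatibleAfterMigration()) {
        // read state with the restore serializer, then rewrite it with the new one
        throw new UnsupportedOperationException("migration step not shown");
    } else {
        throw new IllegalStateException("serializers are incompatible");
    }
}
```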
static void writeVersionedSnapshot(DataOutputView out, TypeSerializerSnapshot<?> snapshot)
throws IOException {
out.writeUTF(snapshot.getClass().getName());
out.writeInt(snapshot.getCurrentVersion());
snapshot.writeSnapshot(out);
}
|
Writes the given snapshot to the out stream. One should always use this method to write
snapshots out, rather than directly calling {@link #writeSnapshot(DataOutputView)}.
<p>The snapshot written with this method can be read via {@link
#readVersionedSnapshot(DataInputView, ClassLoader)}.
|
writeVersionedSnapshot
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSnapshot.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSnapshot.java
|
Apache-2.0
|
static <T> TypeSerializerSnapshot<T> readVersionedSnapshot(DataInputView in, ClassLoader cl)
throws IOException {
final TypeSerializerSnapshot<T> snapshot =
TypeSerializerSnapshotSerializationUtil.readAndInstantiateSnapshotClass(in, cl);
int version = in.readInt();
snapshot.readSnapshot(version, in, cl);
return snapshot;
}
|
Reads a snapshot from the stream, resolving and instantiating the snapshot class via the
given class loader.
<p>This method reads snapshots written by {@link #writeVersionedSnapshot(DataOutputView,
TypeSerializerSnapshot)}.
|
readVersionedSnapshot
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSnapshot.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSnapshot.java
|
Apache-2.0
|
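A round-trip sketch for the two methods above, assuming Flink's `DataOutputSerializer`/`DataInputDeserializer` as in-memory `DataOutputView`/`DataInputView` implementations (checked exceptions elided):

```java
DataOutputSerializer out = new DataOutputSerializer(128);
TypeSerializerSnapshot.writeVersionedSnapshot(out, snapshot);

DataInputDeserializer in = new DataInputDeserializer(out.getCopyOfBuffer());
TypeSerializerSnapshot<?> restored =
        TypeSerializerSnapshot.readVersionedSnapshot(
                in, Thread.currentThread().getContextClassLoader());
```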
public static <T> void writeSerializerSnapshot(
DataOutputView out, TypeSerializerSnapshot<T> serializerSnapshot) throws IOException {
new TypeSerializerSnapshotSerializationProxy<>(serializerSnapshot).write(out);
}
|
Writes a {@link TypeSerializerSnapshot} to the provided data output view.
<p>It is written with a format that can be later read again using {@link
#readSerializerSnapshot(DataInputView, ClassLoader)}.
@param out the data output view
@param serializerSnapshot the serializer configuration snapshot to write
|
writeSerializerSnapshot
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSnapshotSerializationUtil.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerSnapshotSerializationUtil.java
|
Apache-2.0
|
public static TypeSerializerSnapshot<?>[] snapshot(
TypeSerializer<?>... originatingSerializers) {
return Arrays.stream(originatingSerializers)
.map(
(TypeSerializer<?> originatingSerializer) ->
originatingSerializer.snapshotConfiguration())
.toArray(TypeSerializerSnapshot[]::new);
}
|
Takes snapshots of the given serializers.
|
snapshot
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerUtils.java
|
Apache-2.0
|
public static <T> Sink<T> wrapSink(org.apache.flink.api.connector.sink2.Sink<T> sink) {
return new WrappedSink<>(sink);
}
|
Wraps a sink-v2 based sink into a DataStream V2 supported sink.
@param sink The sink-v2 based sink to wrap.
@return The DataStream V2 supported sink.
|
wrapSink
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/dsv2/DataStreamV2SinkUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/dsv2/DataStreamV2SinkUtils.java
|
Apache-2.0
|
public static <T> Source<T> wrapSource(
org.apache.flink.api.connector.source.Source<T, ?, ?> source) {
return new WrappedSource<>(source);
}
|
Wraps a FLIP-27 based source into a DataStream V2 supported source.
@param source The FLIP-27 based source to wrap.
@return The DataStream V2 supported source.
|
wrapSource
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/dsv2/DataStreamV2SourceUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/dsv2/DataStreamV2SourceUtils.java
|
Apache-2.0
|
public static <T> Source<T> fromData(Collection<T> data) {
Preconditions.checkNotNull(data, "Collection must not be null");
return new FromDataSource<>(data);
}
|
Creates a source that contains the given elements. The type of the data stream is that of
the elements in the collection.
@param data The collection of elements to create the source from.
@param <T> The generic type of the returned data stream.
@return The source representing the given collection
|
fromData
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/dsv2/DataStreamV2SourceUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/dsv2/DataStreamV2SourceUtils.java
|
Apache-2.0
|
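A one-line usage sketch of `fromData(...)`:

```java
Source<String> letters = DataStreamV2SourceUtils.fromData(Arrays.asList("a", "b", "c"));
```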
public org.apache.flink.api.connector.sink2.Sink<T> getWrappedSink() {
return wrappedSink;
}
|
Returns the sink-v2 based sink that this {@link Sink} wraps.
|
getWrappedSink
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/dsv2/WrappedSink.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/dsv2/WrappedSink.java
|
Apache-2.0
|
public org.apache.flink.api.connector.source.Source<T, ?, ?> getWrappedSource() {
return wrappedSource;
}
|
Returns the FLIP-27 source that this {@link Source} wraps.
|
getWrappedSource
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/dsv2/WrappedSource.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/dsv2/WrappedSource.java
|
Apache-2.0
|
default <MetaT> Optional<Consumer<MetaT>> metadataConsumer() {
return Optional.empty();
}
|
Returns a metadata consumer; the {@link SinkWriter} can publish metadata events of type
{@link MetaT} to the consumer.
<p>It is recommended to use a separate thread pool to publish the metadata, because enqueuing
a lot of these messages in the mailbox may lead to a performance decrease; in that case the
consumer must be thread-safe and the {@link Consumer#accept} method must execute very fast.
|
metadataConsumer
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/sink2/WriterInitContext.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/sink2/WriterInitContext.java
|
Apache-2.0
|
public int getSubtaskId() {
return subtaskId;
}
|
@return the ID of the subtask that runs the source reader.
|
getSubtaskId
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/source/ReaderInfo.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/source/ReaderInfo.java
|
Apache-2.0
|
@PublicEvolving
default void pauseOrResumeSplits(
Collection<String> splitsToPause, Collection<String> splitsToResume) {
throw new UnsupportedOperationException(
"This source reader does not support pausing or resuming splits which can lead to unaligned splits.\n"
+ "Unaligned splits are splits where the output watermarks of the splits have diverged more than the allowed limit.\n"
+ "It is highly discouraged to use unaligned source splits, as this leads to unpredictable\n"
+ "watermark alignment if there is more than a single split per reader. It is recommended to implement pausing splits\n"
+ "for this source. At your own risk, you can allow unaligned source splits by setting the\n"
+ "configuration parameter `pipeline.watermark-alignment.allow-unaligned-source-splits' to true.\n"
+ "Beware that this configuration parameter will be dropped in a future Flink release.");
}
|
Pauses or resumes reading of individual source splits.
<p>Note that no other methods can be called in parallel, so updating subscriptions can be
done atomically. This method simply gives connectors with more expressive APIs the
opportunity to update all subscriptions at once.
<p>This is currently used to align the watermarks of splits, if watermark alignment is used
and the source reads from more than one split.
<p>The default implementation throws an {@link UnsupportedOperationException} and will be
removed in future releases. To be compatible with future releases, it is recommended to
override this default implementation.
@param splitsToPause the splits to pause
@param splitsToResume the splits to resume
|
pauseOrResumeSplits
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/source/SourceReader.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/source/SourceReader.java
|
Apache-2.0
|
default int currentParallelism() {
throw new UnsupportedOperationException();
}
|
Get the current parallelism of this Source.
@return the parallelism of the Source.
|
currentParallelism
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/source/SourceReaderContext.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/source/SourceReaderContext.java
|
Apache-2.0
|
default void emitWatermark(Watermark watermark) {
throw new UnsupportedOperationException();
}
|
Send the watermark to source output.
<p>This should only be used for datastream v2.
|
emitWatermark
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/source/SourceReaderContext.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/source/SourceReaderContext.java
|
Apache-2.0
|
default void sendEventToSourceReader(int subtaskId, int attemptNumber, SourceEvent event) {
throw new UnsupportedOperationException();
}
|
Send a source event to a source reader. The source reader is identified by its subtask id and
attempt number. It is similar to {@link #sendEventToSourceReader(int, SourceEvent)} but it is
aware of the subtask execution attempt to send this event to.
<p>The {@link SplitEnumerator} must invoke this method instead of {@link
#sendEventToSourceReader(int, SourceEvent)} if it is used in cases where a subtask can have
multiple concurrent execution attempts, e.g. if speculative execution is enabled. Otherwise
an error will be thrown when the split enumerator tries to send a custom source event.
@param subtaskId the subtask id of the source reader to send this event to.
@param attemptNumber the attempt number of the source reader to send this event to.
@param event the source event to send.
|
sendEventToSourceReader
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/source/SplitEnumeratorContext.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/source/SplitEnumeratorContext.java
|
Apache-2.0
|
default Map<Integer, Map<Integer, ReaderInfo>> registeredReadersOfAttempts() {
throw new UnsupportedOperationException();
}
|
Get the currently registered readers of all the subtask attempts. The mapping is from subtask
id to a map which maps an attempt to its reader info.
@return the currently registered readers.
|
registeredReadersOfAttempts
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/source/SplitEnumeratorContext.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/source/SplitEnumeratorContext.java
|
Apache-2.0
|
default void assignSplit(SplitT split, int subtask) {
assignSplits(new SplitsAssignment<>(split, subtask));
}
|
Assigns a single split.
<p>When assigning multiple splits, it is more efficient to assign all of them in a single
call to the {@link #assignSplits(SplitsAssignment)} method.
@param split The new split
@param subtask The index of the operator's parallel subtask that shall receive the split.
|
assignSplit
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/source/SplitEnumeratorContext.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/source/SplitEnumeratorContext.java
|
Apache-2.0
|
@Override
protected E convert(E value) {
return value;
}
|
Converts a value produced by the underlying iterator into the record emitted by the reader;
this implementation is the identity conversion. The enclosing reader is a {@link
SourceReader} that returns the values of an iterator, supplied via an {@link
IteratorSourceSplit}.
<p>The {@code IteratorSourceSplit} is also responsible for taking the current iterator and
turning it back into a split for checkpointing.
@param <E> The type of events returned by the reader.
@param <IterT> The type of the iterator that produces the events. This type exists to make the
conversion between iterator and {@code IteratorSourceSplit} type safe.
@param <SplitT> The concrete type of the {@code IteratorSourceSplit} that creates and converts
the iterator that produces this reader's elements.
|
convert
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/connector/source/lib/util/IteratorSourceReader.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/connector/source/lib/util/IteratorSourceReader.java
|
Apache-2.0
|
public int getId() {
return id;
}
|
Returns the unique ID of this {@code Transformation}.
|
getId
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
public boolean isParallelismConfigured() {
return parallelismConfigured;
}
|
Returns whether the parallelism of this {@code Transformation} has been explicitly
configured.
@return true if the parallelism was set explicitly, false otherwise.
|
isParallelismConfigured
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
public void declareManagedMemoryUseCaseAtSlotScope(ManagedMemoryUseCase managedMemoryUseCase) {
checkNotNull(managedMemoryUseCase);
checkArgument(managedMemoryUseCase.scope == ManagedMemoryUseCase.Scope.SLOT);
managedMemorySlotScopeUseCases.add(managedMemoryUseCase);
}
|
Declares that this transformation contains certain slot scope managed memory use case.
@param managedMemoryUseCase The use case that this transformation declares needing managed
memory for.
|
declareManagedMemoryUseCaseAtSlotScope
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
protected void updateManagedMemoryStateBackendUseCase(boolean hasStateBackend) {
if (hasStateBackend) {
managedMemorySlotScopeUseCases.add(ManagedMemoryUseCase.STATE_BACKEND);
} else {
managedMemorySlotScopeUseCases.remove(ManagedMemoryUseCase.STATE_BACKEND);
}
}
|
Adds or removes the {@link ManagedMemoryUseCase#STATE_BACKEND} slot scope use case for this
transformation, depending on whether it uses a state backend backed by managed memory.
|
updateManagedMemoryStateBackendUseCase
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
public void setUidHash(String uidHash) {
checkNotNull(uidHash);
checkArgument(
uidHash.matches("^[0-9A-Fa-f]{32}$"),
"Node hash must be a 32 character String that describes a hex code. Found: "
+ uidHash);
this.userProvidedNodeHash = uidHash;
}
|
Sets the user-provided node hash for this {@code Transformation}. The hash must be a
32-character hexadecimal string.
@param uidHash The user provided hash.
|
setUidHash
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
public String getUserProvidedNodeHash() {
return userProvidedNodeHash;
}
|
Gets the user-provided node hash that was set via {@link #setUidHash(String)}. This hash is
used to create the JobVertexID (that is shown in logs and the web ui), so that the same
operator ID can be assigned across job submissions (for example when starting a job from a
savepoint).
@return The user provided hash, or null if none was set.
|
getUserProvidedNodeHash
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
public Optional<SlotSharingGroup> getSlotSharingGroup() {
return slotSharingGroup;
}
|
Returns the slot sharing group of this transformation, if one has been set. Parallel
instances of operations that are in the same slot sharing group will be co-located in the
same TaskManager slot, if possible.
@return The slot sharing group, or an empty {@code Optional} if none was set explicitly.
|
getSlotSharingGroup
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
public void setSlotSharingGroup(String slotSharingGroupName) {
this.slotSharingGroup =
Optional.of(SlotSharingGroup.newBuilder(slotSharingGroupName).build());
}
|
Sets the slot sharing group of this transformation. Parallel instances of operations that are
in the same slot sharing group will be co-located in the same TaskManager slot, if possible.
<p>Initially, an operation is in the default slot sharing group. This can be explicitly set
using {@code setSlotSharingGroup("default")}.
@param slotSharingGroupName The slot sharing group's name.
|
setSlotSharingGroup
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
public void setSlotSharingGroup(SlotSharingGroup slotSharingGroup) {
this.slotSharingGroup = Optional.of(slotSharingGroup);
}
|
Sets the slot sharing group of this transformation. Parallel instances of operations that are
in the same slot sharing group will be co-located in the same TaskManager slot, if possible.
<p>Initially, an operation is in the default slot sharing group. This can be explicitly set
by constructing a {@link SlotSharingGroup} with name {@code "default"}.
@param slotSharingGroup which contains the name and the resource spec of the slot sharing
group.
|
setSlotSharingGroup
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
public void setCoLocationGroupKey(@Nullable String coLocationGroupKey) {
this.coLocationGroupKey = coLocationGroupKey;
}
|
<b>NOTE:</b> This is an internal undocumented feature for now. It is not clear whether this
will be supported and stable in the long term.
<p>Sets the key that identifies the co-location group. Operators with the same co-location
key will have their corresponding subtasks placed into the same slot by the scheduler.
<p>Setting this to null (which is the default) means there is no co-location constraint.
|
setCoLocationGroupKey
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
public TypeInformation<T> getOutputType() {
if (outputType instanceof MissingTypeInfo) {
MissingTypeInfo typeInfo = (MissingTypeInfo) this.outputType;
throw new InvalidTypesException(
"The return type of function '"
+ typeInfo.getFunctionName()
+ "' could not be determined automatically, due to type erasure. "
+ "You can give type information hints by using the returns(...) "
+ "method on the result of the transformation call, or by letting "
+ "your function implement the 'ResultTypeQueryable' "
+ "interface.",
typeInfo.getTypeException());
}
typeUsed = true;
return this.outputType;
}
|
Returns the output type of this {@code Transformation} as a {@link TypeInformation}. Once it
has been used, the output type can no longer be changed via {@link #setOutputType}.
@return The output type of this {@code Transformation}
|
getOutputType
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/dag/Transformation.java
|
Apache-2.0
|
@Override
public Byte getKey(T value) throws Exception {
return 0;
}
|
Used as a dummy {@link KeySelector} to allow using keyed operators for non-keyed use cases.
Essentially, it gives all incoming records the same key, which is a {@code (byte) 0} value.
@param <T> The type of the input element.
|
getKey
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/functions/NullByteKeySelector.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/functions/NullByteKeySelector.java
|
Apache-2.0
|
public static AvroUtils getAvroUtils() {
// try and load the special AvroUtils from the flink-avro package
try {
Class<?> clazz =
Class.forName(
AVRO_KRYO_UTILS, false, Thread.currentThread().getContextClassLoader());
return clazz.asSubclass(AvroUtils.class).getConstructor().newInstance();
} catch (ClassNotFoundException e) {
// cannot find the utils, return the default implementation
return new DefaultAvroUtils();
} catch (Exception e) {
throw new RuntimeException("Could not instantiate " + AVRO_KRYO_UTILS + ".", e);
}
}
|
Returns either the default {@link AvroUtils}, which throws an exception in cases where Avro
would be needed, or loads the specific utils for Avro from flink-avro.
|
getAvroUtils
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/AvroUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/AvroUtils.java
|
Apache-2.0
|
public TypeInformation<K> getKeyTypeInfo() {
return keyTypeInfo;
}
|
Gets the type information for the keys in the map.
|
getKeyTypeInfo
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/MapTypeInfo.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/MapTypeInfo.java
|
Apache-2.0
|
public static RowTypeInfo projectFields(RowTypeInfo rowType, int[] fieldMapping) {
TypeInformation[] fieldTypes = new TypeInformation[fieldMapping.length];
String[] fieldNames = new String[fieldMapping.length];
for (int i = 0; i < fieldMapping.length; i++) {
fieldTypes[i] = rowType.getTypeAt(fieldMapping[i]);
fieldNames[i] = rowType.getFieldNames()[fieldMapping[i]];
}
return new RowTypeInfo(fieldTypes, fieldNames);
}
|
Creates a {@link RowTypeInfo} with projected fields.
@param rowType The original RowTypeInfo whose fields are projected
@param fieldMapping The field mapping of the projection
@return A RowTypeInfo with projected fields.
|
projectFields
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/RowTypeInfo.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/RowTypeInfo.java
|
Apache-2.0
|
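A sketch of `projectFields(...)`; field names and types are made up:

```java
RowTypeInfo rowType = new RowTypeInfo(
        new TypeInformation<?>[] {Types.STRING, Types.INT, Types.DOUBLE},
        new String[] {"name", "count", "score"});

// Keep only "score" and "name", in that order:
RowTypeInfo projected = RowTypeInfo.projectFields(rowType, new int[] {2, 0});
```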
public static LambdaExecutable checkAndExtractLambda(Function function)
throws TypeExtractionException {
try {
// get serialized lambda
SerializedLambda serializedLambda = null;
for (Class<?> clazz = function.getClass();
clazz != null;
clazz = clazz.getSuperclass()) {
try {
Method replaceMethod = clazz.getDeclaredMethod("writeReplace");
replaceMethod.setAccessible(true);
Object serialVersion = replaceMethod.invoke(function);
// check if class is a lambda function
if (serialVersion != null
&& serialVersion.getClass() == SerializedLambda.class) {
serializedLambda = (SerializedLambda) serialVersion;
break;
}
} catch (NoSuchMethodException e) {
// thrown if the method is not there. fall through the loop
}
}
// not a lambda method -> return null
if (serializedLambda == null) {
return null;
}
// find lambda method
String className = serializedLambda.getImplClass();
String methodName = serializedLambda.getImplMethodName();
String methodSig = serializedLambda.getImplMethodSignature();
Class<?> implClass =
Class.forName(
className.replace('/', '.'),
true,
Thread.currentThread().getContextClassLoader());
// find constructor
if (methodName.equals("<init>")) {
Constructor<?>[] constructors = implClass.getDeclaredConstructors();
for (Constructor<?> constructor : constructors) {
if (getConstructorDescriptor(constructor).equals(methodSig)) {
return new LambdaExecutable(constructor);
}
}
}
// find method
else {
List<Method> methods = getAllDeclaredMethods(implClass);
for (Method method : methods) {
if (method.getName().equals(methodName)
&& getMethodDescriptor(method).equals(methodSig)) {
return new LambdaExecutable(method);
}
}
}
throw new TypeExtractionException("No lambda method found.");
} catch (Exception e) {
throw new TypeExtractionException(
"Could not extract lambda method out of function: "
+ e.getClass().getSimpleName()
+ " - "
+ e.getMessage(),
e);
}
}
|
Checks if the given function has been implemented using a Java 8 lambda. If yes, a
LambdaExecutable describing the method/constructor is returned; otherwise null.
@throws TypeExtractionException lambda extraction is pretty hacky; it might fail due to unknown
JVM issues.
|
checkAndExtractLambda
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
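A hedged sketch of the lambda-vs-anonymous-class distinction this method makes:
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.typeutils.TypeExtractionUtils;
import org.apache.flink.api.java.typeutils.TypeExtractionUtils.LambdaExecutable;

static void lambdaExtractionSketch() throws Exception {
    MapFunction<String, Integer> lambda = String::length;
    // A lambda yields a LambdaExecutable describing the synthetic lambda method.
    LambdaExecutable exec = TypeExtractionUtils.checkAndExtractLambda(lambda);
    MapFunction<String, Integer> anon =
            new MapFunction<String, Integer>() {
                @Override
                public Integer map(String value) {
                    return value.length();
                }
            };
    // An ordinary (anonymous) class yields null.
    assert TypeExtractionUtils.checkAndExtractLambda(anon) == null;
}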
public static Type extractTypeFromLambda(
Class<?> baseClass,
LambdaExecutable exec,
int[] lambdaTypeArgumentIndices,
int paramLen,
int baseParametersLen) {
Type output =
exec.getParameterTypes()[
paramLen - baseParametersLen + lambdaTypeArgumentIndices[0]];
for (int i = 1; i < lambdaTypeArgumentIndices.length; i++) {
validateLambdaType(baseClass, output);
output = extractTypeArgument(output, lambdaTypeArgumentIndices[i]);
}
validateLambdaType(baseClass, output);
return output;
}
|
Extracts the type at the given index from a lambda. It supports nested types.
@param baseClass SAM function that the lambda implements
@param exec lambda function to extract the type from
@param lambdaTypeArgumentIndices position of type to extract in type hierarchy
@param paramLen count of total parameters of the lambda (including closure parameters)
@param baseParametersLen count of lambda interface parameters (without closure parameters)
@return extracted type
|
extractTypeFromLambda
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
public static Type extractTypeArgument(Type t, int index) throws InvalidTypesException {
if (t instanceof ParameterizedType) {
Type[] actualTypeArguments = ((ParameterizedType) t).getActualTypeArguments();
if (index < 0 || index >= actualTypeArguments.length) {
throw new InvalidTypesException(
"Cannot extract the type argument with index "
+ index
+ " because the type has only "
+ actualTypeArguments.length
+ " type arguments.");
} else {
return actualTypeArguments[index];
}
} else {
throw new InvalidTypesException(
"The given type " + t + " is not a parameterized type.");
}
}
|
This method extracts the n-th type argument from the given type. An InvalidTypesException is
thrown if the type does not have any type arguments or if the index exceeds the number of type
arguments.
@param t Type to extract the type arguments from
@param index Index of the type argument to extract
@return The extracted type argument
@throws InvalidTypesException if the given type does not have any type arguments or if the
index exceeds the number of type arguments.
|
extractTypeArgument
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
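A sketch using plain reflection; Holder is a hypothetical class that carries the generic signature:
import java.lang.reflect.Type;
import java.util.Map;
import org.apache.flink.api.java.typeutils.TypeExtractionUtils;

class Holder {
    Map<String, Long> sampleField;
}

static void extractTypeArgumentSketch() throws NoSuchFieldException {
    Type mapType = Holder.class.getDeclaredField("sampleField").getGenericType();
    Type keyType = TypeExtractionUtils.extractTypeArgument(mapType, 0);   // java.lang.String
    Type valueType = TypeExtractionUtils.extractTypeArgument(mapType, 1); // java.lang.Long
}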
public static Method getSingleAbstractMethod(Class<?> baseClass) {
if (!baseClass.isInterface()) {
throw new InvalidTypesException(
"Given class: " + baseClass + "is not a FunctionalInterface.");
}
Method sam = null;
for (Method method : baseClass.getMethods()) {
if (Modifier.isAbstract(method.getModifiers())) {
if (sam == null) {
sam = method;
} else {
throw new InvalidTypesException(
"Given class: "
+ baseClass
+ " is not a FunctionalInterface. It has more than one abstract method.");
}
}
}
if (sam == null) {
throw new InvalidTypesException(
"Given class: "
+ baseClass
+ " is not a FunctionalInterface. It does not have any abstract methods.");
}
return sam;
}
|
Extracts a Single Abstract Method (SAM) as defined in the Java Specification (4.3.2. The Class
Object, 9.8 Functional Interfaces, 9.4.3 Interface Method Body) from the given class.
@param baseClass a class that is a FunctionalInterface to retrieve a SAM from
@throws InvalidTypesException if the given class does not implement FunctionalInterface
@return single abstract method of the given class
|
getSingleAbstractMethod
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
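A short sketch against a known functional interface from flink-core:
import java.lang.reflect.Method;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.typeutils.TypeExtractionUtils;

Method sam = TypeExtractionUtils.getSingleAbstractMethod(MapFunction.class);
// sam.getName() == "map"; a non-interface or an interface with zero or more
// than one abstract method triggers an InvalidTypesException instead.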
public static List<Method> getAllDeclaredMethods(Class<?> clazz) {
List<Method> result = new ArrayList<>();
while (clazz != null) {
Method[] methods = clazz.getDeclaredMethods();
Collections.addAll(result, methods);
clazz = clazz.getSuperclass();
}
return result;
}
|
Returns all declared methods of a class including methods of superclasses.
|
getAllDeclaredMethods
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
@SuppressWarnings("unchecked")
public static <T> Class<T> typeToClass(Type t) {
if (t instanceof Class) {
return (Class<T>) t;
} else if (t instanceof ParameterizedType) {
return ((Class<T>) ((ParameterizedType) t).getRawType());
}
throw new IllegalArgumentException("Cannot convert type to class");
}
|
Converts a ParameterizedType or a Class to a Class.
|
typeToClass
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
public static boolean isClassType(Type t) {
return t instanceof Class<?> || t instanceof ParameterizedType;
}
|
Checks if a type can be converted to a Class. This is true for ParameterizedType and Class.
|
isClassType
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
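A sketch combining isClassType and typeToClass; Holder is a hypothetical helper class:
import java.lang.reflect.Type;
import java.util.List;
import org.apache.flink.api.java.typeutils.TypeExtractionUtils;

class Holder {
    List<String> names;
}

static void typeToClassSketch() throws NoSuchFieldException {
    Type t = Holder.class.getDeclaredField("names").getGenericType(); // List<String>
    if (TypeExtractionUtils.isClassType(t)) {
        // For a ParameterizedType this yields the raw class, here java.util.List.
        Class<List<String>> raw = TypeExtractionUtils.typeToClass(t);
    }
}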
public static boolean sameTypeVars(Type t1, Type t2) {
return t1 instanceof TypeVariable
&& t2 instanceof TypeVariable
&& ((TypeVariable<?>) t1).getName().equals(((TypeVariable<?>) t2).getName())
&& ((TypeVariable<?>) t1)
.getGenericDeclaration()
.equals(((TypeVariable<?>) t2).getGenericDeclaration());
}
|
Checks whether two types are type variables that describe the same variable, i.e. have the same
name and the same generic declaration.
|
sameTypeVars
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
public static Type getTypeHierarchy(List<Type> typeHierarchy, Type t, Class<?> stopAtClass) {
while (!(isClassType(t) && typeToClass(t).equals(stopAtClass))) {
typeHierarchy.add(t);
t = typeToClass(t).getGenericSuperclass();
if (t == null) {
break;
}
}
return t;
}
|
Traverses the type hierarchy of a type up until a certain stop class is found.
@param t type for which a hierarchy needs to be created
@return type of the immediate child of the stop class
|
getTypeHierarchy
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
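A sketch of the traversal, with hypothetical classes A and B:
import java.lang.reflect.Type;
import java.util.ArrayList;
import java.util.List;
import org.apache.flink.api.java.typeutils.TypeExtractionUtils;

class A<T> {}

class B extends A<String> {}

List<Type> hierarchy = new ArrayList<>();
// Walks from B up to the stop class A, collecting the traversed types.
Type reached = TypeExtractionUtils.getTypeHierarchy(hierarchy, B.class, A.class);
// hierarchy == [B.class]; 'reached' is the parameterized type A<String>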
public static boolean hasSuperclass(Class<?> clazz, String superClassName) {
List<Type> hierarchy = new ArrayList<>();
getTypeHierarchy(hierarchy, clazz, Object.class);
for (Type t : hierarchy) {
if (isClassType(t) && typeToClass(t).getName().equals(superClassName)) {
return true;
}
}
return false;
}
|
Returns true if the given class has a superclass of the given name.
@param clazz class to be analyzed
@param superClassName class name of the super class
|
hasSuperclass
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
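A sketch using JDK classes:
import java.util.ArrayList;
import org.apache.flink.api.java.typeutils.TypeExtractionUtils;

boolean isList = TypeExtractionUtils.hasSuperclass(ArrayList.class, "java.util.AbstractList"); // true
boolean isMap = TypeExtractionUtils.hasSuperclass(ArrayList.class, "java.util.HashMap");       // false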
public static Class<?> getRawClass(Type t) {
if (isClassType(t)) {
return typeToClass(t);
} else if (t instanceof GenericArrayType) {
Type component = ((GenericArrayType) t).getGenericComponentType();
return Array.newInstance(getRawClass(component), 0).getClass();
}
return Object.class;
}
|
Returns the raw class of both parameterized types and generic arrays. Returns
java.lang.Object for all other types.
|
getRawClass
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
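A sketch for the generic-array branch; Holder is hypothetical:
import java.lang.reflect.Type;
import java.util.List;
import org.apache.flink.api.java.typeutils.TypeExtractionUtils;

class Holder {
    List<String>[] lists; // field type is a GenericArrayType
}

static void rawClassSketch() throws NoSuchFieldException {
    Type t = Holder.class.getDeclaredField("lists").getGenericType();
    Class<?> raw = TypeExtractionUtils.getRawClass(t); // class [Ljava.util.List;
}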
public static void validateLambdaType(Class<?> baseClass, Type t) {
if (!(t instanceof Class)) {
return;
}
final Class<?> clazz = (Class<?>) t;
if (clazz.getTypeParameters().length > 0) {
throw new InvalidTypesException(
"The generic type parameters of '"
+ clazz.getSimpleName()
+ "' are missing. "
+ "In many cases lambda methods don't provide enough information for automatic type extraction when Java generics are involved. "
+ "An easy workaround is to use an (anonymous) class instead that implements the '"
+ baseClass.getName()
+ "' interface. "
+ "Otherwise the type has to be specified explicitly using type information.");
}
}
|
Checks whether the given type has the generic parameters declared in the class definition.
@param t type to be validated
|
validateLambdaType
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
|
Apache-2.0
|
@SuppressWarnings("unchecked")
@PublicEvolving
public static <IN, OUT> TypeInformation<OUT> getUnaryOperatorReturnType(
Function function,
Class<?> baseClass,
int inputTypeArgumentIndex,
int outputTypeArgumentIndex,
int[] lambdaOutputTypeArgumentIndices,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) {
Preconditions.checkArgument(
inType == null || inputTypeArgumentIndex >= 0,
"Input type argument index was not provided");
Preconditions.checkArgument(
outputTypeArgumentIndex >= 0, "Output type argument index was not provided");
Preconditions.checkArgument(
lambdaOutputTypeArgumentIndices != null,
"Indices for output type arguments within lambda not provided");
// explicit result type has highest precedence
if (function instanceof ResultTypeQueryable) {
return ((ResultTypeQueryable<OUT>) function).getProducedType();
}
// perform extraction
try {
final LambdaExecutable exec;
try {
exec = checkAndExtractLambda(function);
} catch (TypeExtractionException e) {
throw new InvalidTypesException("Internal error occurred.", e);
}
if (exec != null) {
// parameters must be accessed from behind, since JVM can add additional parameters
// e.g. when using local variables inside lambda function
// paramLen is the total number of parameters of the provided lambda, it includes
// parameters added through closure
final int paramLen = exec.getParameterTypes().length;
final Method sam = TypeExtractionUtils.getSingleAbstractMethod(baseClass);
// number of parameters the SAM of implemented interface has; the parameter indexing
// applies to this range
final int baseParametersLen = sam.getParameterCount();
final Type output;
if (lambdaOutputTypeArgumentIndices.length > 0) {
output =
TypeExtractionUtils.extractTypeFromLambda(
baseClass,
exec,
lambdaOutputTypeArgumentIndices,
paramLen,
baseParametersLen);
} else {
output = exec.getReturnType();
TypeExtractionUtils.validateLambdaType(baseClass, output);
}
return new TypeExtractor().privateCreateTypeInfo(output, inType, null);
} else {
if (inType != null) {
validateInputType(
baseClass, function.getClass(), inputTypeArgumentIndex, inType);
}
return new TypeExtractor()
.privateCreateTypeInfo(
baseClass,
function.getClass(),
outputTypeArgumentIndex,
inType,
null);
}
} catch (InvalidTypesException e) {
if (allowMissing) {
return (TypeInformation<OUT>)
new MissingTypeInfo(
functionName != null ? functionName : function.toString(), e);
} else {
throw e;
}
}
}
|
Returns the unary operator's return type.
<p>This method can extract a type in 4 different ways:
<p>1. By using the generics of the base class like MyFunction<X, Y, Z, IN, OUT>. This is what
outputTypeArgumentIndex (in this example "4") is good for.
<p>2. By using input type inference SubMyFunction<T, String, String, String, T>. This is what
inputTypeArgumentIndex (in this example "0") and inType is good for.
<p>3. By using the static method that a compiler generates for Java lambdas. This is what
lambdaOutputTypeArgumentIndices is good for. Given that MyFunction has the following single
abstract method:
<pre>
<code>
void apply(IN value, Collector<OUT> out)
</code>
</pre>
<p>Lambda type indices allow the extraction of a type from lambdas. To extract the output
type <b>OUT</b> from the function one should pass {@code new int[] {1, 0}}. "1" for selecting
the parameter and 0 for the first generic in this type. Use {@code TypeExtractor.NO_INDEX}
for selecting the return type of the lambda for extraction or if the class cannot be a lambda
because it is not a single abstract method interface.
<p>4. By using interfaces such as {@link TypeInfoFactory} or {@link ResultTypeQueryable}.
<p>See also comments in the header of this class.
@param function Function to extract the return type from
@param baseClass Base class of the function
@param inputTypeArgumentIndex Index of input generic type in the base class specification
(ignored if inType is null)
@param outputTypeArgumentIndex Index of output generic type in the base class specification
@param lambdaOutputTypeArgumentIndices Table of indices of the type argument specifying the
output type. See example.
@param inType Type of the input elements (In case of an iterable, it is the element type) or
null
@param functionName Function name
@param allowMissing Can the type information be missing (this generates a MissingTypeInfo for
postponing an exception)
@param <IN> Input type
@param <OUT> Output type
@return TypeInformation of the return type of the function
|
getUnaryOperatorReturnType
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
Apache-2.0
|
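A hedged sketch mirroring the docstring's {1, 0} example, using FlatMapFunction<IN, OUT> whose SAM is void flatMap(IN value, Collector<OUT> out):
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.typeutils.TypeExtractor;

FlatMapFunction<String, Integer> fn = (value, out) -> out.collect(value.length());
TypeInformation<Integer> outType =
        TypeExtractor.getUnaryOperatorReturnType(
                fn,
                FlatMapFunction.class,
                0,                // IN is generic index 0 in FlatMapFunction<IN, OUT>
                1,                // OUT is generic index 1
                new int[] {1, 0}, // OUT sits in SAM parameter 1, generic position 0
                Types.STRING,     // input type information
                "myFlatMap",
                false);           // fail instead of returning a MissingTypeInfo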
private <IN1> TypeInformation<?> createTypeInfoFromInput(
TypeVariable<?> returnTypeVar,
List<Type> inputTypeHierarchy,
Type inType,
TypeInformation<IN1> inTypeInfo) {
TypeInformation<?> info = null;
// use a factory to find corresponding type information to type variable
final List<Type> factoryHierarchy = new ArrayList<>(inputTypeHierarchy);
final TypeInfoFactory<?> factory = getClosestFactory(factoryHierarchy, inType);
if (factory != null) {
// the type that defines the factory is last in factory hierarchy
final Type factoryDefiningType = factoryHierarchy.get(factoryHierarchy.size() - 1);
// defining type has generics, the factory need to be asked for a mapping of subtypes to
// type information
if (factoryDefiningType instanceof ParameterizedType) {
final Type[] typeParams = typeToClass(factoryDefiningType).getTypeParameters();
final Type[] actualParams =
((ParameterizedType) factoryDefiningType).getActualTypeArguments();
// go thru all elements and search for type variables
for (int i = 0; i < actualParams.length; i++) {
final Map<String, TypeInformation<?>> componentInfo =
inTypeInfo.getGenericParameters();
final String typeParamName = typeParams[i].toString();
if (!componentInfo.containsKey(typeParamName)
|| componentInfo.get(typeParamName) == null) {
throw new InvalidTypesException(
"TypeInformation '"
+ inTypeInfo.getClass().getSimpleName()
+ "' does not supply a mapping of TypeVariable '"
+ typeParamName
+ "' to corresponding TypeInformation. "
+ "Input type inference can only produce a result with this information. "
+ "Please implement method 'TypeInformation.getGenericParameters()' for this.");
}
info =
createTypeInfoFromInput(
returnTypeVar,
factoryHierarchy,
actualParams[i],
componentInfo.get(typeParamName));
if (info != null) {
break;
}
}
}
}
// the input is a type variable
else if (sameTypeVars(inType, returnTypeVar)) {
return inTypeInfo;
} else if (inType instanceof TypeVariable) {
Type resolvedInType =
materializeTypeVariable(inputTypeHierarchy, (TypeVariable<?>) inType);
if (resolvedInType != inType) {
info =
createTypeInfoFromInput(
returnTypeVar, inputTypeHierarchy, resolvedInType, inTypeInfo);
}
}
// input is an array
else if (inType instanceof GenericArrayType) {
TypeInformation<?> componentInfo = null;
if (inTypeInfo instanceof BasicArrayTypeInfo) {
componentInfo = ((BasicArrayTypeInfo<?, ?>) inTypeInfo).getComponentInfo();
} else if (inTypeInfo instanceof PrimitiveArrayTypeInfo) {
componentInfo =
BasicTypeInfo.getInfoFor(inTypeInfo.getTypeClass().getComponentType());
} else if (inTypeInfo instanceof ObjectArrayTypeInfo) {
componentInfo = ((ObjectArrayTypeInfo<?, ?>) inTypeInfo).getComponentInfo();
}
info =
createTypeInfoFromInput(
returnTypeVar,
inputTypeHierarchy,
((GenericArrayType) inType).getGenericComponentType(),
componentInfo);
}
// the input is a tuple
else if (inTypeInfo instanceof TupleTypeInfo
&& isClassType(inType)
&& Tuple.class.isAssignableFrom(typeToClass(inType))) {
ParameterizedType tupleBaseClass;
// get tuple from possible tuple subclass
while (!(isClassType(inType)
&& typeToClass(inType).getSuperclass().equals(Tuple.class))) {
inputTypeHierarchy.add(inType);
inType = typeToClass(inType).getGenericSuperclass();
}
inputTypeHierarchy.add(inType);
// we can assume to be parameterized since we
// already did input validation
tupleBaseClass = (ParameterizedType) inType;
Type[] tupleElements = tupleBaseClass.getActualTypeArguments();
// go thru all tuple elements and search for type variables
for (int i = 0; i < tupleElements.length; i++) {
info =
createTypeInfoFromInput(
returnTypeVar,
inputTypeHierarchy,
tupleElements[i],
((TupleTypeInfo<?>) inTypeInfo).getTypeAt(i));
if (info != null) {
break;
}
}
}
// the input is a pojo
else if (inTypeInfo instanceof PojoTypeInfo && isClassType(inType)) {
// build the entire type hierarchy for the pojo
getTypeHierarchy(inputTypeHierarchy, inType, Object.class);
// determine a field containing the type variable
List<Field> fields = getAllDeclaredFields(typeToClass(inType), false);
for (Field field : fields) {
Type fieldType = field.getGenericType();
if (fieldType instanceof TypeVariable
&& sameTypeVars(
returnTypeVar,
materializeTypeVariable(
inputTypeHierarchy, (TypeVariable<?>) fieldType))) {
return getTypeOfPojoField(inTypeInfo, field);
} else if (fieldType instanceof ParameterizedType
|| fieldType instanceof GenericArrayType) {
List<Type> typeHierarchyWithFieldType = new ArrayList<>(inputTypeHierarchy);
typeHierarchyWithFieldType.add(fieldType);
TypeInformation<?> foundInfo =
createTypeInfoFromInput(
returnTypeVar,
typeHierarchyWithFieldType,
fieldType,
getTypeOfPojoField(inTypeInfo, field));
if (foundInfo != null) {
return foundInfo;
}
}
}
}
return info;
}
|
Finds the type information for a type variable.
<p>It solves the following:
<p>Return the type information for "returnTypeVar" given that "inType" has the type information
"inTypeInfo". Thus, "inType" must contain "returnTypeVar" within the "inputTypeHierarchy";
otherwise null is returned.
|
createTypeInfoFromInput
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
Apache-2.0
|
private <IN1, IN2> TypeInformation<?>[] createSubTypesInfo(
Type originalType,
ParameterizedType definingType,
List<Type> typeHierarchy,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
boolean lenient) {
Type[] subtypes = new Type[definingType.getActualTypeArguments().length];
// materialize possible type variables
for (int i = 0; i < subtypes.length; i++) {
final Type actualTypeArg = definingType.getActualTypeArguments()[i];
// materialize immediate TypeVariables
if (actualTypeArg instanceof TypeVariable<?>) {
subtypes[i] =
materializeTypeVariable(typeHierarchy, (TypeVariable<?>) actualTypeArg);
}
// class or parameterized type
else {
subtypes[i] = actualTypeArg;
}
}
TypeInformation<?>[] subTypesInfo = new TypeInformation<?>[subtypes.length];
for (int i = 0; i < subtypes.length; i++) {
final List<Type> subTypeHierarchy = new ArrayList<>(typeHierarchy);
subTypeHierarchy.add(subtypes[i]);
// sub type could not be determined with materializing
// try to derive the type info of the TypeVariable from the immediate base child input
// as a last attempt
if (subtypes[i] instanceof TypeVariable<?>) {
subTypesInfo[i] =
createTypeInfoFromInputs(
(TypeVariable<?>) subtypes[i], subTypeHierarchy, in1Type, in2Type);
// variable could not be determined
if (subTypesInfo[i] == null && !lenient) {
throw new InvalidTypesException(
"Type of TypeVariable '"
+ ((TypeVariable<?>) subtypes[i]).getName()
+ "' in '"
+ ((TypeVariable<?>) subtypes[i]).getGenericDeclaration()
+ "' could not be determined. This is most likely a type erasure problem. "
+ "The type extraction currently supports types with generic variables only in cases where "
+ "all variables in the return type can be deduced from the input type(s). "
+ "Otherwise the type has to be specified explicitly using type information.");
}
} else {
// create the type information of the subtype or null/exception
try {
subTypesInfo[i] =
createTypeInfoWithTypeHierarchy(
subTypeHierarchy, subtypes[i], in1Type, in2Type);
} catch (InvalidTypesException e) {
if (lenient) {
subTypesInfo[i] = null;
} else {
throw e;
}
}
}
}
// check that number of fields matches the number of subtypes
if (!lenient) {
Class<?> originalTypeAsClass = null;
if (isClassType(originalType)) {
originalTypeAsClass = typeToClass(originalType);
}
checkNotNull(originalTypeAsClass, "originalType has an unexpected type");
// check if the class we assumed to conform to the defining type so far is actually a
// pojo because the
// original type contains additional fields.
// check for additional fields.
int fieldCount = countFieldsInClass(originalTypeAsClass);
if (fieldCount > subTypesInfo.length) {
return null;
}
}
return subTypesInfo;
}
|
Creates the TypeInformation for all elements of a type that expects a certain number of
subtypes (e.g. TupleXX).
@param originalType most concrete subclass
@param definingType type that defines the number of subtypes (e.g. Tuple2 -> 2 subtypes)
@param typeHierarchy necessary for type inference
@param in1Type necessary for type inference
@param in2Type necessary for type inference
@param lenient decides whether exceptions should be thrown if a subtype cannot be determined
@return array containing the TypeInformation of the subtypes, or null if the original type
contains more fields than the defining type specifies subtypes
|
createSubTypesInfo
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
Apache-2.0
|
@SuppressWarnings("unchecked")
private <IN1, IN2, OUT> TypeInformation<OUT> createTypeInfoFromFactory(
Type t,
List<Type> typeHierarchy,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) {
final List<Type> factoryHierarchy = new ArrayList<>(typeHierarchy);
final TypeInfoFactory<? super OUT> factory = getClosestFactory(factoryHierarchy, t);
if (factory == null) {
return null;
}
final Type factoryDefiningType = factoryHierarchy.get(factoryHierarchy.size() - 1);
return createTypeInfoFromFactory(
t, in1Type, in2Type, factoryHierarchy, factory, factoryDefiningType);
}
|
Creates type information using a factory, if one exists for this type or its supertypes.
Returns null otherwise.
|
createTypeInfoFromFactory
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
Apache-2.0
|
@Internal
@SuppressWarnings("unchecked")
public static <OUT> TypeInfoFactory<OUT> getTypeInfoFactory(Type t) {
final Class<?> factoryClass;
if (registeredTypeInfoFactories.containsKey(t)) {
factoryClass = registeredTypeInfoFactories.get(t);
} else {
if (!isClassType(t) || !typeToClass(t).isAnnotationPresent(TypeInfo.class)) {
return null;
}
final TypeInfo typeInfoAnnotation = typeToClass(t).getAnnotation(TypeInfo.class);
factoryClass = typeInfoAnnotation.value();
// check for valid factory class
if (!TypeInfoFactory.class.isAssignableFrom(factoryClass)) {
throw new InvalidTypesException(
"TypeInfo annotation does not specify a valid TypeInfoFactory.");
}
}
// instantiate
return (TypeInfoFactory<OUT>) InstantiationUtil.instantiate(factoryClass);
}
|
Returns the type information factory for a type using the factory registry or annotations.
|
getTypeInfoFactory
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
Apache-2.0
|
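A hedged sketch; MyType and MyTypeFactory are hypothetical user classes wired together via the @TypeInfo annotation:
import java.lang.reflect.Type;
import java.util.Map;
import org.apache.flink.api.common.typeinfo.TypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInfoFactory;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.typeutils.TypeExtractor;

// Hypothetical factory that falls back to generic (Kryo-based) type information.
class MyTypeFactory extends TypeInfoFactory<MyType> {
    @Override
    public TypeInformation<MyType> createTypeInfo(
            Type t, Map<String, TypeInformation<?>> genericParameters) {
        return Types.GENERIC(MyType.class);
    }
}

@TypeInfo(MyTypeFactory.class)
class MyType {}

// The annotation makes the factory discoverable without explicit registration.
TypeInfoFactory<MyType> factory = TypeExtractor.getTypeInfoFactory(MyType.class);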
@Internal
@SuppressWarnings("unchecked")
public static <OUT> TypeInfoFactory<OUT> getTypeInfoFactory(Field field) {
if (!isClassType(field.getType()) || !field.isAnnotationPresent(TypeInfo.class)) {
return null;
}
Class<?> factoryClass = field.getAnnotation(TypeInfo.class).value();
// check for valid factory class
if (!TypeInfoFactory.class.isAssignableFrom(factoryClass)) {
throw new InvalidTypesException(
"TypeInfo annotation does not specify a valid TypeInfoFactory.");
}
return (TypeInfoFactory<OUT>) InstantiationUtil.instantiate(factoryClass);
}
|
Returns the type information factory for an annotated field.
|
getTypeInfoFactory
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
Apache-2.0
|
private static int countTypeInHierarchy(List<Type> typeHierarchy, Type type) {
int count = 0;
for (Type t : typeHierarchy) {
if (t == type
|| (isClassType(type) && t == typeToClass(type))
|| (isClassType(t) && typeToClass(t) == type)) {
count++;
}
}
return count;
}
|
@return number of items with equal type or same raw type
|
countTypeInHierarchy
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
Apache-2.0
|
private static Type materializeTypeVariable(List<Type> typeHierarchy, TypeVariable<?> typeVar) {
TypeVariable<?> inTypeTypeVar = typeVar;
// iterate thru hierarchy from top to bottom until type variable gets a class assigned
for (int i = typeHierarchy.size() - 1; i >= 0; i--) {
Type curT = typeHierarchy.get(i);
// parameterized type
if (curT instanceof ParameterizedType) {
Class<?> rawType = ((Class<?>) ((ParameterizedType) curT).getRawType());
for (int paramIndex = 0;
paramIndex < rawType.getTypeParameters().length;
paramIndex++) {
TypeVariable<?> curVarOfCurT = rawType.getTypeParameters()[paramIndex];
// check if variable names match
if (sameTypeVars(curVarOfCurT, inTypeTypeVar)) {
Type curVarType =
((ParameterizedType) curT).getActualTypeArguments()[paramIndex];
// another type variable level
if (curVarType instanceof TypeVariable<?>) {
inTypeTypeVar = (TypeVariable<?>) curVarType;
}
// class
else {
return curVarType;
}
}
}
}
}
// can not be materialized, most likely due to type erasure
// return the type variable of the deepest level
return inTypeTypeVar;
}
|
Tries to find a concrete value (Class, ParameterizedType, etc.) for a TypeVariable by
traversing the type hierarchy downwards. If no value can be found, it returns the bottommost
type variable in the hierarchy.
|
materializeTypeVariable
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
Apache-2.0
|
@PublicEvolving
public static boolean isRecord(Class<?> clazz) {
Class<?> superclass = clazz.getSuperclass();
return superclass != null
&& superclass.getName().equals("java.lang.Record")
&& (clazz.getModifiers() & Modifier.FINAL) != 0;
}
|
Determine whether the given class is a valid Java record.
@param clazz class to check
@return True if the class is a Java record
|
isRecord
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
|
Apache-2.0
|
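A short sketch (requires Java 14+ records on the classpath):
import org.apache.flink.api.java.typeutils.TypeExtractor;

record Point(int x, int y) {} // records are implicitly final and extend java.lang.Record

boolean r1 = TypeExtractor.isRecord(Point.class);  // true
boolean r2 = TypeExtractor.isRecord(String.class); // false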
T build() {
try {
return canonicalConstructor.newInstance(args);
} catch (Exception e) {
throw new RuntimeException("Could not instantiate record", e);
}
}
|
Builder class for incremental record construction.
|
build
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/JavaRecordBuilderFactory.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/JavaRecordBuilderFactory.java
|
Apache-2.0
|
public static <T> T copy(T from, Kryo kryo, TypeSerializer<T> serializer) {
try {
return kryo.copy(from);
} catch (KryoException ke) {
// Kryo could not copy the object --> try to serialize/deserialize the object
try {
byte[] byteArray = InstantiationUtil.serializeToByteArray(serializer, from);
return InstantiationUtil.deserializeFromByteArray(serializer, byteArray);
} catch (IOException ioe) {
throw new RuntimeException(
"Could not copy object by serializing/deserializing" + " it.", ioe);
}
}
}
|
Tries to copy the given record {@code from} using the provided Kryo instance. If this fails,
the record is copied by serializing it into a byte buffer and deserializing it from there.
@param from Element to copy
@param kryo Kryo instance to use
@param serializer TypeSerializer which is used in case of a Kryo failure
@param <T> Type of the element to be copied
@return Copied element
|
copy
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/KryoUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/KryoUtils.java
|
Apache-2.0
|
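A hedged sketch; the element type and the supplied serializer are assumed to come from elsewhere:
import com.esotericsoftware.kryo.Kryo;
import org.apache.flink.api.common.typeutils.TypeSerializer;
import org.apache.flink.api.java.typeutils.runtime.KryoUtils;

static <T> T deepCopy(T original, Kryo kryo, TypeSerializer<T> serializer) {
    // Fast path is Kryo's copy; on a KryoException the utility falls back to a
    // serialize/deserialize round trip through the given serializer.
    return KryoUtils.copy(original, kryo, serializer);
}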
public static <T> T copy(T from, T reuse, Kryo kryo, TypeSerializer<T> serializer) {
try {
return kryo.copy(from);
} catch (KryoException ke) {
// Kryo could not copy the object --> try to serialize/deserialize the object
try {
byte[] byteArray = InstantiationUtil.serializeToByteArray(serializer, from);
return InstantiationUtil.deserializeFromByteArray(serializer, reuse, byteArray);
} catch (IOException ioe) {
throw new RuntimeException(
"Could not copy object by serializing/deserializing" + " it.", ioe);
}
}
}
|
Tries to copy the given record {@code from} using the provided Kryo instance. If this fails,
the record is copied by serializing it into a byte buffer and deserializing it from there.
@param from Element to copy
@param reuse Reuse element for the deserialization
@param kryo Kryo instance to use
@param serializer TypeSerializer which is used in case of a Kryo failure
@param <T> Type of the element to be copied
@return Copied element
|
copy
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/KryoUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/KryoUtils.java
|
Apache-2.0
|
public static void applyRegistrations(
Kryo kryo,
Collection<KryoRegistration> resolvedRegistrations,
int firstRegistrationId) {
int currentRegistrationId = firstRegistrationId;
Serializer<?> serializer;
for (KryoRegistration registration : resolvedRegistrations) {
serializer = registration.getSerializer(kryo);
if (serializer != null) {
kryo.register(registration.getRegisteredClass(), serializer, currentRegistrationId);
} else {
kryo.register(registration.getRegisteredClass(), currentRegistrationId);
}
// if Kryo already had a serializer for that type then it ignores the registration
if (kryo.getRegistration(currentRegistrationId) != null) {
currentRegistrationId++;
}
}
}
|
Apply a list of {@link KryoRegistration} to a Kryo instance. The list of registrations is
assumed to already be a final resolution of all possible registration overwrites.
<p>The registrations are applied in the given order and always specify the registration id,
using the given {@code firstRegistrationId} and incrementing it for each registration.
@param kryo the Kryo instance to apply the registrations to
@param resolvedRegistrations the registrations, which should already be resolved of all
possible registration overwrites
@param firstRegistrationId the first registration id to use
|
applyRegistrations
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/KryoUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/KryoUtils.java
|
Apache-2.0
|
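A hedged sketch; MyEvent and MyPayload are hypothetical user classes:
import com.esotericsoftware.kryo.Kryo;
import java.util.Arrays;
import org.apache.flink.api.java.typeutils.runtime.KryoRegistration;
import org.apache.flink.api.java.typeutils.runtime.KryoUtils;

Kryo kryo = new Kryo();
// MyEvent and MyPayload stand in for user types; ids continue after Kryo's
// default registrations.
KryoUtils.applyRegistrations(
        kryo,
        Arrays.asList(
                new KryoRegistration(MyEvent.class),
                new KryoRegistration(MyPayload.class)),
        kryo.getNextRegistrationId());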
private static byte[] createPadding(
int originalSerializerLength, boolean padNullValueIfFixedLen) {
boolean padNullValue = originalSerializerLength > 0 && padNullValueIfFixedLen;
return padNullValue ? new byte[originalSerializerLength] : EMPTY_BYTE_ARRAY;
}
|
Serializer wrapper to add support of {@code null} value serialization.
<p>If the target serializer does not support {@code null} values of its type, you can use this
class to wrap it. This is a generic treatment of {@code null} value serialization which comes
at the cost of one additional byte in the final serialized value. The {@code
NullableSerializer} will intercept the {@code null} value serialization case and prepend the
target serialized value with a boolean flag marking whether it is {@code null} or not.
<pre>{@code
TypeSerializer<T> originalSerializer = ...;
TypeSerializer<T> serializerWithNullValueSupport = NullableSerializer.wrap(originalSerializer);
// or
TypeSerializer<T> serializerWithNullValueSupport = NullableSerializer.wrapIfNullIsNotSupported(originalSerializer);
}</pre>
@param <T> type to serialize
|
createPadding
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/NullableSerializer.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/NullableSerializer.java
|
Apache-2.0
|
public static <T> TypeSerializer<T> wrapIfNullIsNotSupported(
@Nonnull TypeSerializer<T> originalSerializer, boolean padNullValueIfFixedLen) {
return checkIfNullSupported(originalSerializer)
? originalSerializer
: wrap(originalSerializer, padNullValueIfFixedLen);
}
|
This method tries to serialize a {@code null} value with the {@code originalSerializer} and
wraps it in case of a {@link NullPointerException}; otherwise it returns the {@code
originalSerializer}.
@param originalSerializer serializer to wrap and add {@code null} support
@param padNullValueIfFixedLen pad null value to preserve the fixed length of original
serializer
@return serializer which supports {@code null} values
|
wrapIfNullIsNotSupported
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/NullableSerializer.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/NullableSerializer.java
|
Apache-2.0
|
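A sketch with a fixed-length serializer from flink-core:
import org.apache.flink.api.common.typeutils.TypeSerializer;
import org.apache.flink.api.common.typeutils.base.LongSerializer;
import org.apache.flink.api.java.typeutils.runtime.NullableSerializer;

// LongSerializer rejects null, so it gets wrapped; the padding flag preserves
// the fixed 8-byte length of the original serialized form.
TypeSerializer<Long> nullable =
        NullableSerializer.wrapIfNullIsNotSupported(LongSerializer.INSTANCE, true);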
static void writeField(DataOutputView out, Field field) throws IOException {
Class<?> declaringClass = field.getDeclaringClass();
out.writeUTF(declaringClass.getName());
out.writeUTF(field.getName());
}
|
Writes a field to the given {@link DataOutputView}.
<p>This write method avoids Java serialization by writing only the class name of the field's
declaring class and the field name. The written field can be read using {@link
#readField(DataInputView, ClassLoader)}.
@param out the output view to write to.
@param field the field to write.
|
writeField
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/PojoFieldUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/PojoFieldUtils.java
|
Apache-2.0
|
static Field readField(DataInputView in, ClassLoader userCodeClassLoader) throws IOException {
Class<?> declaringClass = InstantiationUtil.resolveClassByName(in, userCodeClassLoader);
String fieldName = in.readUTF();
return getField(fieldName, declaringClass);
}
|
Reads a field from the given {@link DataInputView}.
<p>This read method avoids Java serialization by reading the class name of the field's
declaring class and dynamically loading it. The field is then looked up by name and obtained
via reflection.
@param in the input view to read from.
@param userCodeClassLoader the user classloader.
@return the read field.
|
readField
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/PojoFieldUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/PojoFieldUtils.java
|
Apache-2.0
|
@Nullable
static Field getField(String fieldName, Class<?> declaringClass) {
Class<?> clazz = declaringClass;
while (clazz != null) {
try {
Field field = clazz.getDeclaredField(fieldName);
field.setAccessible(true);
return field;
} catch (NoSuchFieldException e) {
clazz = clazz.getSuperclass();
}
}
return null;
}
|
Finds a field by name from its declaring class. This also searches for the field in
superclasses.
@param fieldName the name of the field to find.
@param declaringClass the declaring class of the field.
@return the field.
|
getField
|
java
|
apache/flink
|
flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/PojoFieldUtils.java
|
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/api/java/typeutils/runtime/PojoFieldUtils.java
|
Apache-2.0
|