StreamExecutionEnvironment.fromElements

How to use the fromElements method in org.apache.flink.streaming.api.environment.StreamExecutionEnvironment

Best Java code snippets using org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.fromElements (Showing top 20 results out of 315)

Common ways to obtain StreamExecutionEnvironment:

private void myMethod() {
  StreamExecutionEnvironment s = StreamExecutionEnvironment.getExecutionEnvironment();
  // or: StreamExecutionEnvironment.createLocalEnvironment()
}
origin: apache/flink

public static void main(String[] args) throws Exception {
  final ParameterTool params = ParameterTool.fromArgs(args);
  final Path inputFile = Paths.get(params.getRequired("inputFile"));
  final Path inputDir = Paths.get(params.getRequired("inputDir"));
  final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.setParallelism(1);
  env.registerCachedFile(inputFile.toString(), "test_data", false);
  env.registerCachedFile(inputDir.toString(), "test_dir", false);
  final Path containedFile;
  try (Stream<Path> files = Files.list(inputDir)) {
    containedFile = files.findAny().orElseThrow(() -> new RuntimeException("Input directory must not be empty."));
  }
  env.fromElements(1)
    .map(new TestMapFunction(
      inputFile.toAbsolutePath().toString(),
      Files.size(inputFile),
      inputDir.toAbsolutePath().toString(),
      containedFile.getFileName().toString()))
    .writeAsText(params.getRequired("output"), FileSystem.WriteMode.OVERWRITE);
  env.execute("Distributed Cache Via Blob Test Program");
}
origin: apache/flink

@Test
public void fromElementsWithBaseTypeTest1() {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.fromElements(ParentClass.class, new SubClass(1, "Java"), new ParentClass(1, "hello"));
}
origin: apache/flink

/**
 * If expected values ever change, double-check that the change is not breaking the contract of
 * {@link StreamingRuntimeContext#getOperatorUniqueID()} being stable between job submissions.
 */
@Test
public void testGetOperatorUniqueID() throws Exception {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();
  env.fromElements(1, 2, 3)
    .map(new VerifyOperatorIDMapFunction("6c4f323f22da8fb6e34f80c61be7a689")).uid("42")
    .map(new VerifyOperatorIDMapFunction("3e129e83691e7737fbf876b47452acbc")).uid("44");
  env.execute();
}
origin: apache/flink

@Test(expected = IllegalArgumentException.class)
public void fromElementsWithBaseTypeTest2() {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.fromElements(SubClass.class, new SubClass(1, "Java"), new ParentClass(1, "hello"));
}
origin: apache/flink

@Before
public void setUp() {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  dataStream1 = env.fromElements("a1", "a2", "a3");
  dataStream2 = env.fromElements("a1", "a2");
  keySelector = element -> element;
  tsAssigner = TumblingEventTimeWindows.of(Time.milliseconds(1));
  joinFunction = (first, second) -> first + second;
}
origin: apache/flink

@Test
public void testPOJOnoHashCodeKeyRejection() {
  KeySelector<POJOWithoutHashCode, POJOWithoutHashCode> keySelector =
      new KeySelector<POJOWithoutHashCode, POJOWithoutHashCode>() {
        @Override
        public POJOWithoutHashCode getKey(POJOWithoutHashCode value) throws Exception {
          return value;
        }
      };
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  DataStream<POJOWithoutHashCode> input = env.fromElements(
      new POJOWithoutHashCode(new int[] {1, 2}));
  // adjust the rule
  expectedException.expect(InvalidProgramException.class);
  input.keyBy(keySelector);
}
origin: apache/flink

private static void runJob() throws Exception {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.fromElements(1, 2, 3)
    .print();
  env.execute();
}
origin: apache/flink

@Test(expected = UnsupportedOperationException.class)
public void testForwardFailsLowToHighParallelism() throws Exception {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  DataStream<Integer> src = env.fromElements(1, 2, 3);
  // this doesn't work because it goes from 1 to 3
  src.forward().map(new NoOpIntMap());
  env.execute();
}
origin: apache/flink

@Before
public void setUp() {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  dataStream1 = env.fromElements("a1", "a2", "a3");
  dataStream2 = env.fromElements("a1", "a2");
  keySelector = element -> element;
  tsAssigner = TumblingEventTimeWindows.of(Time.milliseconds(1L));
  coGroupFunction = (CoGroupFunction<String, String, String>) (first, second, out) -> out.collect("");
}
origin: apache/flink

@Test(expected = UnsupportedOperationException.class)
public void testIncorrectParallelism() throws Exception {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  DataStream<Integer> source = env.fromElements(1, 10);
  IterativeStream<Integer> iter1 = source.iterate();
  SingleOutputStreamOperator<Integer> map1 = iter1.map(noOpIntMap);
  iter1.closeWith(map1).print();
}
origin: apache/flink

private <K> void testKeyRejection(KeySelector<Tuple2<Integer[], String>, K> keySelector, TypeInformation<K> expectedKeyType) {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  DataStream<Tuple2<Integer[], String>> input = env.fromElements(
      new Tuple2<>(new Integer[] {1, 2}, "barfoo")
  );
  Assert.assertEquals(expectedKeyType, TypeExtractor.getKeySelectorTypes(keySelector, input.getType()));
  // adjust the rule
  expectedException.expect(InvalidProgramException.class);
  expectedException.expectMessage(new StringStartsWith("Type " + expectedKeyType + " cannot be used as key."));
  input.keyBy(keySelector);
}
origin: apache/flink

@Test
public void testPOJOWithNestedArrayAndHashCodeWorkAround() {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  DataStream<POJOWithHashCode> input = env.fromElements(
      new POJOWithHashCode(new int[] {1, 2}));
  input.keyBy(new KeySelector<POJOWithHashCode, POJOWithHashCode>() {
    @Override
    public POJOWithHashCode getKey(POJOWithHashCode value) throws Exception {
      return value;
    }
  }).addSink(new SinkFunction<POJOWithHashCode>() {
    @Override
    public void invoke(POJOWithHashCode value) throws Exception {
      Assert.assertArrayEquals(new int[]{1, 2}, value.getId());
    }
  });
}
origin: apache/flink

@Test
public void testOperatorChainWithObjectReuseAndNoOutputOperators() throws Exception {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.getConfig().enableObjectReuse();
  DataStream<Integer> input = env.fromElements(1, 2, 3);
  input.flatMap(new FlatMapFunction<Integer, Integer>() {
    @Override
    public void flatMap(Integer value, Collector<Integer> out) throws Exception {
      out.collect(value << 1);
    }
  });
  env.execute();
}
origin: apache/flink

@Test
public void testPrimitiveKeyAcceptance() throws Exception {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.setParallelism(1);
  env.setMaxParallelism(1);
  DataStream<Integer> input = env.fromElements(new Integer(10000));
  KeyedStream<Integer, Object> keyedStream = input.keyBy(new KeySelector<Integer, Object>() {
    @Override
    public Object getKey(Integer value) throws Exception {
      return value;
    }
  });
  keyedStream.addSink(new SinkFunction<Integer>() {
    @Override
    public void invoke(Integer value) throws Exception {
      Assert.assertEquals(10000L, (long) value);
    }
  });
}
origin: apache/flink

@Test(expected = IllegalStateException.class)
public void testExecutionWithEmptyIteration() throws Exception {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  DataStream<Integer> source = env.fromElements(1, 10).map(noOpIntMap);
  IterativeStream<Integer> iter1 = source.iterate();
  iter1.map(noOpIntMap).print();
  env.execute();
}
origin: apache/flink

@Test(expected = UnsupportedOperationException.class)
public void testDifferingParallelism() throws Exception {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  // introduce dummy mapper to get to correct parallelism
  DataStream<Integer> source = env.fromElements(1, 10)
      .map(noOpIntMap);
  IterativeStream<Integer> iter1 = source.iterate();
  iter1.closeWith(iter1.map(noOpIntMap).setParallelism(parallelism / 2));
}
origin: apache/flink

@Test(expected = UnsupportedOperationException.class)
public void testClosingFromOutOfLoop() throws Exception {
  // this test verifies that we cannot close an iteration with a DataStream that does not
  // have the iteration in its predecessors
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  // introduce dummy mapper to get to correct parallelism
  DataStream<Integer> source = env.fromElements(1, 10).map(noOpIntMap);
  IterativeStream<Integer> iter1 = source.iterate();
  IterativeStream<Integer> iter2 = source.iterate();
  iter2.closeWith(iter1.map(noOpIntMap));
}
origin: apache/flink

@Test
public void testSelectAfterSideOutputIsForbidden() {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  SingleOutputStreamOperator<String> processInput = env.fromElements("foo")
    .process(new DummyProcessFunction());
  processInput.getSideOutput(outputTag);
  try {
    processInput.split(Collections::singleton);
    Assert.fail("Should have failed early with an exception.");
  } catch (UnsupportedOperationException expected){
    // expected
  }
}
origin: apache/flink

@Test
public void testSideOutputAfterSelectIsForbidden() {
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  SingleOutputStreamOperator<String> processInput = env.fromElements("foo")
    .process(new DummyProcessFunction());
  processInput.split(Collections::singleton);
  try {
    processInput.getSideOutput(outputTag);
    Assert.fail("Should have failed early with an exception.");
  } catch (UnsupportedOperationException expected){
    // expected
  }
}
origin: apache/flink

@Test
public void testApplyWindowAllState() throws Exception {
  final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.setStreamTimeCharacteristic(TimeCharacteristic.IngestionTime);
  env.registerTypeWithKryoSerializer(File.class, JavaSerializer.class);
  DataStream<File> src = env.fromElements(new File("/"));
  SingleOutputStreamOperator<?> result = src
      .timeWindowAll(Time.milliseconds(1000))
      .apply(new AllWindowFunction<File, String, TimeWindow>() {
        @Override
        public void apply(TimeWindow window, Iterable<File> input, Collector<String> out) {}
      });
  validateListStateDescriptorConfigured(result);
}
org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.fromElements

Javadoc

Creates a new data stream that contains the given elements. The framework determines the element type from the base type supplied by the user; each element must be of that base type or a subclass of it. The sequence of elements must not be empty. Note that this operation results in a non-parallel data stream source, i.e. a data stream source with a degree of parallelism of one.
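For reference, a minimal sketch of both overloads of fromElements (the variable names and job name below are illustrative, not taken from the snippets above):

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// Element type inferred from the elements themselves:
DataStream<String> words = env.fromElements("flink", "stream");

// Base type supplied explicitly, so subclasses of Number may be mixed:
DataStream<Number> numbers = env.fromElements(Number.class, 1, 2L, 3.0);

words.print();
numbers.print();
env.execute("fromElements example");

Both sources run with parallelism one, per the Javadoc above; downstream operators can be rescaled with setParallelism.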

Popular methods of StreamExecutionEnvironment

  • execute
  • getExecutionEnvironment
    Creates an execution environment that represents the context in which the program is currently executed.
  • addSource
    Adds a data source with a custom type information, thus opening a DataStream. Only in very special cases does the user need to supply type information.
  • getConfig
    Gets the config object.
  • enableCheckpointing
    Enables checkpointing for the streaming job. The distributed state of the streaming dataflow will be periodically snapshotted.
  • setStreamTimeCharacteristic
    Sets the time characteristic for all streams created from this environment, e.g., processing time, event time, or ingestion time.
  • setParallelism
    Sets the parallelism for operations executed through this environment. Setting a parallelism of x here will cause all operators (such as map, batchReduce) to run with x parallel instances.
  • setStateBackend
    Sets the state backend that describes how to store and checkpoint operator state. It defines both which data structures hold state during execution and where checkpointed data is persisted.
  • createLocalEnvironment
    Creates a LocalStreamEnvironment. The local execution environment will run the program in a multi-threaded fashion in the same JVM as the environment was created in.
  • fromCollection
    Creates a data stream from the given iterator. Because the iterator will remain unmodified until the actual execution happens, the type of the data it returns must be given explicitly.
  • getCheckpointConfig
    Gets the checkpoint config, which defines values like checkpoint interval, delay between checkpoints, etc.
  • getParallelism
    Gets the parallelism with which operations are executed by default. Operations can individually override this value.
  • getStreamGraph
  • setRestartStrategy
  • socketTextStream
  • readTextFile
  • generateSequence
  • clean
  • getStreamTimeCharacteristic
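
A short sketch tying several of these methods together (the checkpoint interval, parallelism, and job name are illustrative values, not defaults):

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.setParallelism(4);             // all operators run with 4 parallel instances
env.enableCheckpointing(10_000L);  // snapshot distributed state every 10 seconds
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
env.fromElements(1, 2, 3)          // non-parallel source, as described above
  .map(x -> 2 * x)
  .print();
env.execute("popular-methods sketch");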
