
How to use Convert in org.apache.beam.sdk.schemas.transforms

Best Java code snippets using org.apache.beam.sdk.schemas.transforms.Convert (Showing top 18 results out of 315)

origin: org.apache.beam/beam-sdks-java-core

/**
 * Convert a {@link PCollection}{@literal <Row>} into a {@link PCollection}{@literal <OutputT>}.
 *
 * <p>The output schema will be inferred using the schema registry. A schema must be registered
 * for this type, or the conversion will fail.
 */
public static <OutputT> PTransform<PCollection<Row>, PCollection<OutputT>> fromRows(
  Class<OutputT> clazz) {
 return to(clazz);
}
origin: org.apache.beam/beam-sdks-java-extensions-sql

@Override
public PCollection<Row> buildIOReader(PBegin begin) {
 assert begin.getPipeline() == upstream.getPipeline();
 return upstream.apply(Convert.toRows());
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testFromRows() {
 PCollection<POJO1> pojos =
   pipeline
     .apply(
       Create.of(EXPECTED_ROW1)
         .withSchema(
           EXPECTED_SCHEMA1,
           SerializableFunctions.identity(),
           SerializableFunctions.identity()))
     .apply(Convert.fromRows(POJO1.class));
 PAssert.that(pojos).containsInAnyOrder(new POJO1());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

/**
 * Convert a {@link PCollection}{@literal <Row>} into a {@link PCollection}{@literal <OutputT>}.
 *
 * <p>The output schema will be inferred using the schema registry. A schema must be registered
 * for this type, or the conversion will fail.
 */
public static <OutputT> PTransform<PCollection<Row>, PCollection<OutputT>> fromRows(
  TypeDescriptor<OutputT> typeDescriptor) {
 return to(typeDescriptor);
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testToRows() {
 PCollection<Row> rows = pipeline.apply(Create.of(new POJO1())).apply(Convert.toRows());
 PAssert.that(rows).containsInAnyOrder(EXPECTED_ROW1);
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

/**
 * Convert a {@link PCollection}{@literal <InputT>} into a {@link PCollection}{@literal <Row>}.
 *
 * <p>The input {@link PCollection} must have a schema attached. The output collection will have
 * the same schema as the input.
 */
public static <InputT> PTransform<PCollection<InputT>, PCollection<Row>> toRows() {
 return to(Row.class);
}
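The `toRows` and `fromRows` transforms form a round trip: a schema'd object becomes a generic `Row` of named fields and can later be rebuilt into a concrete type. The following is a minimal, stdlib-only sketch of that round trip, using a `Map<String, Object>` as a stand-in for Beam's `Row` and reflection as a stand-in for the schema registry; it illustrates the idea and is not Beam's implementation.

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

public class RowRoundTrip {
  // Hypothetical POJO with a "schema" of two named fields.
  static class POJO1 {
    public String field1 = "a";
    public int field2 = 42;
  }

  // Stand-in for Convert.toRows(): read each field into a name -> value map.
  static Map<String, Object> toRow(Object o) throws Exception {
    Map<String, Object> row = new LinkedHashMap<>();
    for (Field f : o.getClass().getDeclaredFields()) {
      row.put(f.getName(), f.get(o));
    }
    return row;
  }

  // Stand-in for Convert.fromRows(): rebuild the object by matching field names.
  static <T> T fromRow(Map<String, Object> row, Class<T> clazz) throws Exception {
    T out = clazz.getDeclaredConstructor().newInstance();
    for (Field f : clazz.getDeclaredFields()) {
      f.set(out, row.get(f.getName()));
    }
    return out;
  }

  public static void main(String[] args) throws Exception {
    Map<String, Object> row = toRow(new POJO1());
    POJO1 back = fromRow(row, POJO1.class);
    System.out.println(row.get("field1") + " " + back.field2);
  }
}
```

In Beam itself the field mapping is driven by the registered `Schema` rather than raw reflection, which is why `fromRows` fails if no schema is registered for the output type.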
origin: org.apache.beam/beam-sdks-java-core

/**
 * Convert a {@link PCollection}{@literal <InputT>} to a {@link PCollection}{@literal <OutputT>}.
 *
 * <p>This function allows converting between two types as long as the two types have
 * <i>compatible</i> schemas. Two schemas are said to be <i>compatible</i> if they recursively
 * have fields with the same names, but possibly different orders.
 */
public static <InputT, OutputT> PTransform<PCollection<InputT>, PCollection<OutputT>> to(
  Class<OutputT> clazz) {
 return to(TypeDescriptor.of(clazz));
}
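The compatibility rule above depends on field names, not declaration order. As a stdlib-only sketch (a simplification of Beam's rule, which also recurses into nested schemas and checks field types), two classes whose declared field names match are candidates for `Convert.to`:

```java
import java.lang.reflect.Field;
import java.util.HashSet;
import java.util.Set;

public class CompatibleSchemas {
  // Hypothetical POJOs: same field names, different declaration order.
  // Convert.to(POJO2.class) can map a PCollection<POJO1> to PCollection<POJO2>.
  static class POJO1 {
    public String field1;
    public int field2;
  }

  static class POJO2 {
    public int field2;
    public String field1;
  }

  // Simplified compatibility check: field-name sets must be equal.
  static boolean compatible(Class<?> a, Class<?> b) {
    return fieldNames(a).equals(fieldNames(b));
  }

  static Set<String> fieldNames(Class<?> c) {
    Set<String> names = new HashSet<>();
    for (Field f : c.getDeclaredFields()) {
      names.add(f.getName());
    }
    return names;
  }

  public static void main(String[] args) {
    System.out.println(compatible(POJO1.class, POJO2.class));
  }
}
```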
origin: org.apache.beam/beam-sdks-java-core

 @Test
 @Category(NeedsRunner.class)
 public void testGeneralConvert() {
  PCollection<POJO2> pojos =
    pipeline.apply(Create.of(new POJO1())).apply(Convert.to(POJO2.class));
  PAssert.that(pojos).containsInAnyOrder(new POJO2());
  pipeline.run();
 }
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testSelectAll() {
 PCollection<POJO1> pojos =
   pipeline
     .apply(Create.of(new POJO1()))
     .apply(Select.fieldAccess(FieldAccessDescriptor.withAllFields()))
     .apply(Convert.to(POJO1.class));
 PAssert.that(pojos).containsInAnyOrder(new POJO1());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testSimpleSelect() {
 PCollection<POJO1Selected> pojos =
   pipeline
     .apply(Create.of(new POJO1()))
     .apply(Select.fieldNames("field1", "field3"))
     .apply(Convert.to(POJO1Selected.class));
 PAssert.that(pojos).containsInAnyOrder(new POJO1Selected());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testComplexCast() throws Exception {
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(All2.class);
 PCollection<All2> pojos =
   pipeline
     .apply(Create.of(new All1()))
     .apply(Cast.narrowing(outputSchema))
     .apply(Convert.to(All2.class));
 PAssert.that(pojos).containsInAnyOrder(new All2());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testTypeNarrow() throws Exception {
 // narrowing is the opposite of widening
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(TypeWiden1.class);
 PCollection<TypeWiden1> pojos =
   pipeline
     .apply(Create.of(new TypeWiden2()))
     .apply(Cast.narrowing(outputSchema))
     .apply(Convert.to(TypeWiden1.class));
 PAssert.that(pojos).containsInAnyOrder(new TypeWiden1());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testTypeWiden() throws Exception {
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(TypeWiden2.class);
 PCollection<TypeWiden2> pojos =
   pipeline
     .apply(Create.of(new TypeWiden1()))
     .apply(Cast.widening(outputSchema))
     .apply(Convert.to(TypeWiden2.class));
 PAssert.that(pojos).containsInAnyOrder(new TypeWiden2());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testWeakedNullable() throws Exception {
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(Nullable2.class);
 PCollection<Nullable2> pojos =
   pipeline
     .apply(Create.of(new Nullable1()))
     .apply(Cast.narrowing(outputSchema))
     .apply(Convert.to(Nullable2.class));
 PAssert.that(pojos).containsInAnyOrder(new Nullable2());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testProjection() throws Exception {
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(Projection2.class);
 PCollection<Projection2> pojos =
   pipeline
     .apply(Create.of(new Projection1()))
     .apply(Cast.widening(outputSchema))
     .apply(Convert.to(Projection2.class));
 PAssert.that(pojos).containsInAnyOrder(new Projection2());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testIgnoreNullable() throws Exception {
 // ignoring nullable is opposite of weakening
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(Nullable1.class);
 PCollection<Nullable1> pojos =
   pipeline
     .apply(Create.of(new Nullable2()))
     .apply(Cast.narrowing(outputSchema))
     .apply(Convert.to(Nullable1.class));
 PAssert.that(pojos).containsInAnyOrder(new Nullable1());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testSelectNestedAll() {
 PCollection<POJO2NestedAll> pojos =
   pipeline
     .apply(Create.of(new POJO2()))
     .apply(
       Select.fieldAccess(
         FieldAccessDescriptor.create()
           .withNestedField("field2", FieldAccessDescriptor.withAllFields())))
     .apply(Convert.to(POJO2NestedAll.class));
 PAssert.that(pojos).containsInAnyOrder(new POJO2NestedAll());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

 @Test
 @Category(NeedsRunner.class)
 public void testSelectNestedPartial() {
  PCollection<POJO2NestedPartial> pojos =
    pipeline
      .apply(Create.of(new POJO2()))
      .apply(
        Select.fieldAccess(
          FieldAccessDescriptor.create()
            .withNestedField(
              "field2", FieldAccessDescriptor.withFieldNames("field1", "field3"))))
      .apply(Convert.to(POJO2NestedPartial.class));
  PAssert.that(pojos).containsInAnyOrder(new POJO2NestedPartial());
  pipeline.run();
 }
}
org.apache.beam.sdk.schemas.transforms.Convert

Javadoc

A set of utilities for converting between different objects supporting schemas.

Most used methods

  • to
    Convert a PCollection<InputT> to a PCollection<OutputT>. This function allows converting between two types as long as they have compatible schemas.
  • toRows
    Convert a PCollection<InputT> into a PCollection<Row>. The input PCollection must have a schema attached; the output collection has the same schema as the input.
  • fromRows
    Convert a PCollection<Row> into a PCollection<OutputT>. The output schema will be inferred using the schema registry; a schema must be registered for the output type.
