TypedBytesInput.readBytes

Best Java code snippets using org.apache.hadoop.hive.contrib.util.typedbytes.TypedBytesInput.readBytes (Showing top 14 results out of 315)

Common ways to obtain TypedBytesInput

  • From a DataInput in: new TypedBytesInput(in)
  • From a thread-local: (TypedBytesInput) threadLocal.get()
  • From an InputStream in: TypedBytesInput.get(new DataInputStream(in))
origin: apache/hive

public BytesWritable readBytes(BytesWritable bw) throws IOException {
 byte[] bytes = in.readBytes();
 if (bw == null) {
  bw = new BytesWritable(bytes);
 } else {
  bw.set(bytes, 0, bytes.length);
 }
 return bw;
}
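The frames that readBytes consumes follow the typed bytes wire format used by Hadoop streaming: a one-byte typecode (0 for Type.BYTES) followed by a four-byte big-endian length and the raw payload. A minimal, stdlib-only sketch of that framing; the helper names here are illustrative, not Hive API:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class TypedBytesSketch {
  static final int BYTES_CODE = 0; // typecode for Type.BYTES

  // Write a BYTES frame: typecode, big-endian length, raw payload.
  static byte[] encodeBytes(byte[] payload) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    DataOutputStream out = new DataOutputStream(baos);
    out.writeByte(BYTES_CODE);
    out.writeInt(payload.length);
    out.write(payload);
    return baos.toByteArray();
  }

  // Read it back, mirroring what in.readBytes() does after the typecode.
  static byte[] decodeBytes(byte[] frame) throws IOException {
    DataInputStream in = new DataInputStream(new ByteArrayInputStream(frame));
    int code = in.readUnsignedByte();
    if (code != BYTES_CODE) {
      throw new IOException("expected Type.BYTES, got typecode " + code);
    }
    byte[] payload = new byte[in.readInt()];
    in.readFully(payload);
    return payload;
  }

  public static void main(String[] args) throws IOException {
    byte[] roundTripped = decodeBytes(encodeBytes("hello".getBytes("UTF-8")));
    System.out.println(new String(roundTripped, "UTF-8")); // prints "hello"
  }
}
```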
origin: apache/hive

public Writable readWritable(Writable writable) throws IOException {
 DataInputStream dis = null;
 try {
  ByteArrayInputStream bais = new ByteArrayInputStream(in.readBytes());
  dis = new DataInputStream(bais);
  String className = WritableUtils.readString(dis);
  if (writable == null) {
   try {
    Class<? extends Writable> cls = conf.getClassByName(className)
      .asSubclass(Writable.class);
    writable = (Writable) ReflectionUtils.newInstance(cls, conf);
   } catch (ClassNotFoundException e) {
    throw new IOException(e);
   }
  } else if (!writable.getClass().getName().equals(className)) {
   throw new IOException("wrong Writable class given");
  }
  writable.readFields(dis);
  dis.close();
  dis = null;
  return writable;
 } finally {
  IOUtils.closeStream(dis);
 }
}
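The byte payload handled by readWritable is itself structured: a class name (written via WritableUtils.writeString in the real code) followed by the Writable's own serialized fields. A stdlib-only sketch of the instantiate-or-check pattern above, using DataInput's readUTF in place of WritableUtils and an illustrative SimpleWritable interface standing in for Hadoop's Writable:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

// Stand-in for org.apache.hadoop.io.Writable.
interface SimpleWritable {
  void readFields(DataInput in) throws IOException;
  void write(DataOutput out) throws IOException;
}

class IntBox implements SimpleWritable {
  int value;
  public void readFields(DataInput in) throws IOException { value = in.readInt(); }
  public void write(DataOutput out) throws IOException { out.writeInt(value); }
}

public class WritableSketch {
  // Mirrors readWritable: read the class name, instantiate (or check) the
  // target instance, then let it deserialize its own fields.
  static SimpleWritable readWritable(byte[] payload, SimpleWritable reuse)
      throws IOException {
    DataInputStream dis = new DataInputStream(new ByteArrayInputStream(payload));
    String className = dis.readUTF(); // Hive uses WritableUtils.readString here
    if (reuse == null) {
      try {
        reuse = (SimpleWritable) Class.forName(className)
            .getDeclaredConstructor().newInstance();
      } catch (ReflectiveOperationException e) {
        throw new IOException(e);
      }
    } else if (!reuse.getClass().getName().equals(className)) {
      throw new IOException("wrong Writable class given");
    }
    reuse.readFields(dis);
    return reuse;
  }

  public static void main(String[] args) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    DataOutputStream dos = new DataOutputStream(baos);
    dos.writeUTF(IntBox.class.getName());
    IntBox box = new IntBox();
    box.value = 42;
    box.write(dos);
    IntBox decoded = (IntBox) readWritable(baos.toByteArray(), null);
    System.out.println(decoded.value); // prints 42
  }
}
```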
origin: org.apache.hadoop.hive/hive-contrib

public Writable readWritable(Writable writable) throws IOException {
 ByteArrayInputStream bais = new ByteArrayInputStream(in.readBytes());
 DataInputStream dis = new DataInputStream(bais);
 String className = WritableUtils.readString(dis);
 if (writable == null) {
  try {
   Class<? extends Writable> cls = conf.getClassByName(className)
     .asSubclass(Writable.class);
   writable = (Writable) ReflectionUtils.newInstance(cls, conf);
  } catch (ClassNotFoundException e) {
   throw new IOException(e);
  }
 } else if (!writable.getClass().getName().equals(className)) {
  throw new IOException("wrong Writable class given");
 }
 writable.readFields(dis);
 return writable;
}
origin: org.apache.hadoop.hive/hive-contrib

public Buffer readBuffer(String tag) throws IOException {
 in.skipType();
 return new Buffer(in.readBytes());
}
origin: org.apache.hadoop.hive/hive-contrib

if (code == Type.BYTES.code) {
 return new Buffer(readBytes());
} else if (code == Type.BYTE.code) {
 return readByte();
// ... branches for the remaining primitive and container typecodes ...
} else if (code == Type.MARKER.code) {
 return null;
} else if (50 <= code && code <= 200) { // application-specific typecodes
 return new Buffer(readBytes());
} else {
 throw new RuntimeException("unknown type");
}
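The excerpt above comes from TypedBytesInput.read(), which dispatches on the leading typecode; codes 50-200 are reserved for applications and fall back to a raw Buffer read. A minimal sketch of that dispatch, with typecode values assumed from the typed bytes spec:

```java
public class DispatchSketch {
  // Illustrative classifier mirroring the typecode dispatch in read().
  static String classify(int code) {
    if (code == 0) {            // Type.BYTES
      return "bytes";
    } else if (code == 1) {     // Type.BYTE
      return "byte";
    } else if (50 <= code && code <= 200) { // application-specific typecodes
      return "app-specific (read as raw bytes)";
    } else {
      throw new RuntimeException("unknown type");
    }
  }

  public static void main(String[] args) {
    System.out.println(classify(0));   // prints "bytes"
    System.out.println(classify(100)); // prints "app-specific (read as raw bytes)"
  }
}
```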
org.apache.hadoop.hive.contrib.util.typedbytes.TypedBytesInput.readBytes

Javadoc

Reads the bytes following a Type.BYTES code.

Popular methods of TypedBytesInput

  • <init>
  • get
    Get a thread-local typed bytes input for the supplied DataInput.
  • read
    Reads a typed bytes sequence and converts it to a Java object. The first byte is interpreted as a typecode.
  • readBool
    Reads the boolean following a Type.BOOL code.
  • readByte
    Reads the byte following a Type.BYTE code.
  • readDouble
    Reads the double following a Type.DOUBLE code.
  • readFloat
    Reads the float following a Type.FLOAT code.
  • readInt
    Reads the integer following a Type.INT code.
  • readList
    Reads the list following a Type.LIST code.
  • readLong
    Reads the long following a Type.LONG code.
  • readMap
    Reads the map following a Type.MAP code.
  • readMapHeader
    Reads the header following a Type.MAP code.
  • readRaw, readRawBool, readRawByte, readRawBytes, readRawDouble, readRawFloat, readRawInt
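Among the methods above, TypedBytesInput.get returns a per-thread reusable instance, re-pointed at the caller's DataInput. A sketch of that thread-local accessor pattern; the field and method names here are illustrative, not Hive's actual internals:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInput;
import java.io.DataInputStream;

public class ThreadLocalSketch {
  private DataInput in;

  private void setDataInput(DataInput in) { this.in = in; }

  // One instance per thread, created lazily on first access.
  private static final ThreadLocal<ThreadLocalSketch> TL =
      ThreadLocal.withInitial(ThreadLocalSketch::new);

  // Mirrors TypedBytesInput.get(DataInput): reuse the thread's instance,
  // pointing it at the supplied input.
  public static ThreadLocalSketch get(DataInput in) {
    ThreadLocalSketch instance = TL.get();
    instance.setDataInput(in);
    return instance;
  }

  public static void main(String[] args) {
    DataInput a = new DataInputStream(new ByteArrayInputStream(new byte[0]));
    DataInput b = new DataInputStream(new ByteArrayInputStream(new byte[0]));
    // The same thread gets the same instance back on every call:
    System.out.println(get(a) == get(b)); // prints true
  }
}
```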
