AggregateProtos$AggregateRequest.hasInterpreterSpecificBytes

How to use the hasInterpreterSpecificBytes method in org.apache.hadoop.hbase.protobuf.generated.AggregateProtos$AggregateRequest

Best Java code snippets using org.apache.hadoop.hbase.protobuf.generated.AggregateProtos$AggregateRequest.hasInterpreterSpecificBytes (Showing top 16 results out of 315)

Common ways to obtain AggregateProtos$AggregateRequest:

  • AggregateProtos.AggregateRequest.Builder builder; … builder.build()
  • (org.apache.hadoop.hbase.protobuf.generated.AggregateProtos.AggregateRequest) object
  • new AggregateProtos.AggregateRequest(builder)
origin: apache/hbase

@SuppressWarnings("unchecked")
// Used server-side too by Aggregation Coprocessor Endpoint. Undo this interdependence. TODO.
ColumnInterpreter<T,S,P,Q,R> constructColumnInterpreterFromRequest(
  AggregateRequest request) throws IOException {
 String className = request.getInterpreterClassName();
 try {
  ColumnInterpreter<T,S,P,Q,R> ci;
  Class<?> cls = Class.forName(className);
  ci = (ColumnInterpreter<T, S, P, Q, R>) cls.getDeclaredConstructor().newInstance();
  if (request.hasInterpreterSpecificBytes()) {
   ByteString b = request.getInterpreterSpecificBytes();
   P initMsg = getParsedGenericInstance(ci.getClass(), 2, b);
   ci.initialize(initMsg);
  }
  return ci;
 } catch (ClassNotFoundException | InstantiationException | IllegalAccessException |
   NoSuchMethodException | InvocationTargetException e) {
  throw new IOException(e);
 }
}
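The guard in the snippet above is the standard presence check for an optional protobuf field: the interpreter-specific bytes are read only after hasInterpreterSpecificBytes() confirms the sender set them. A minimal self-contained sketch of that pattern, where FakeAggregateRequest is a hypothetical stand-in for the generated HBase class (the real one tracks presence with a bit field; here a null reference plays that role):

```java
// Hypothetical stand-in for the generated AggregateRequest.
class FakeAggregateRequest {
    private final byte[] interpreterSpecificBytes; // null means "field unset"

    FakeAggregateRequest(byte[] interpreterSpecificBytes) {
        this.interpreterSpecificBytes = interpreterSpecificBytes;
    }

    boolean hasInterpreterSpecificBytes() {
        return interpreterSpecificBytes != null;
    }

    byte[] getInterpreterSpecificBytes() {
        // Like a generated protobuf getter, return the field's default when unset.
        return interpreterSpecificBytes != null ? interpreterSpecificBytes : new byte[0];
    }
}

class PresenceCheckDemo {
    public static void main(String[] args) {
        FakeAggregateRequest withBytes = new FakeAggregateRequest(new byte[] {1, 2, 3});
        FakeAggregateRequest withoutBytes = new FakeAggregateRequest(null);

        // Mirror of the coprocessor code: only act on the bytes when present.
        if (withBytes.hasInterpreterSpecificBytes()) {
            System.out.println(withBytes.getInterpreterSpecificBytes().length); // prints 3
        }
        System.out.println(withoutBytes.hasInterpreterSpecificBytes()); // prints false
    }
}
```

Skipping ci.initialize(...) when the field is absent is what makes interpreter_specific_bytes genuinely optional: interpreters that need no configuration simply never see an init message.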
origin: apache/hbase

@java.lang.Override
public int hashCode() {
 if (memoizedHashCode != 0) {
  return memoizedHashCode;
 }
 int hash = 41;
 hash = (19 * hash) + getDescriptorForType().hashCode();
 if (hasInterpreterClassName()) {
  hash = (37 * hash) + INTERPRETER_CLASS_NAME_FIELD_NUMBER;
  hash = (53 * hash) + getInterpreterClassName().hashCode();
 }
 if (hasScan()) {
  hash = (37 * hash) + SCAN_FIELD_NUMBER;
  hash = (53 * hash) + getScan().hashCode();
 }
 if (hasInterpreterSpecificBytes()) {
  hash = (37 * hash) + INTERPRETER_SPECIFIC_BYTES_FIELD_NUMBER;
  hash = (53 * hash) + getInterpreterSpecificBytes().hashCode();
 }
 hash = (29 * hash) + getUnknownFields().hashCode();
 memoizedHashCode = hash;
 return hash;
}
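The generated hashCode() above memoizes its result: memoizedHashCode doubles as a computed-once cache, with 0 meaning "not yet computed", and only the fields that are present contribute to the hash. A reduced sketch of that caching pattern (class and field names are illustrative, not the generated code):

```java
// Illustrative memoized-hashCode pattern, as used by generated protobuf messages.
class MemoizedMessage {
    private final String name;        // stands in for the message's fields
    private int memoizedHashCode = 0; // 0 is the "not computed yet" sentinel

    MemoizedMessage(String name) {
        this.name = name;
    }

    @Override
    public int hashCode() {
        if (memoizedHashCode != 0) {
            return memoizedHashCode; // reuse the cached value on later calls
        }
        int hash = 41;
        hash = (53 * hash) + name.hashCode();
        if (hash == 0) {
            hash = 1; // keep 0 reserved as the sentinel
        }
        memoizedHashCode = hash;
        return hash;
    }
}
```

The memoization is safe because the message is immutable once built, so the hash can never change after the first computation.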
origin: apache/hbase

@java.lang.Override
public boolean equals(final java.lang.Object obj) {
 if (obj == this) {
  return true;
 }
 if (!(obj instanceof org.apache.hadoop.hbase.protobuf.generated.AggregateProtos.AggregateRequest)) {
  return super.equals(obj);
 }
 org.apache.hadoop.hbase.protobuf.generated.AggregateProtos.AggregateRequest other = (org.apache.hadoop.hbase.protobuf.generated.AggregateProtos.AggregateRequest) obj;
 boolean result = true;
 result = result && (hasInterpreterClassName() == other.hasInterpreterClassName());
 if (hasInterpreterClassName()) {
  result = result && getInterpreterClassName()
    .equals(other.getInterpreterClassName());
 }
 result = result && (hasScan() == other.hasScan());
 if (hasScan()) {
  result = result && getScan()
    .equals(other.getScan());
 }
 result = result && (hasInterpreterSpecificBytes() == other.hasInterpreterSpecificBytes());
 if (hasInterpreterSpecificBytes()) {
  result = result && getInterpreterSpecificBytes()
    .equals(other.getInterpreterSpecificBytes());
 }
 result = result &&
   getUnknownFields().equals(other.getUnknownFields());
 return result;
}
origin: apache/hbase

public Builder mergeFrom(org.apache.hadoop.hbase.protobuf.generated.AggregateProtos.AggregateRequest other) {
 if (other == org.apache.hadoop.hbase.protobuf.generated.AggregateProtos.AggregateRequest.getDefaultInstance()) return this;
 if (other.hasInterpreterClassName()) {
  bitField0_ |= 0x00000001;
  interpreterClassName_ = other.interpreterClassName_;
  onChanged();
 }
 if (other.hasScan()) {
  mergeScan(other.getScan());
 }
 if (other.hasInterpreterSpecificBytes()) {
  setInterpreterSpecificBytes(other.getInterpreterSpecificBytes());
 }
 this.mergeUnknownFields(other.getUnknownFields());
 return this;
}
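mergeFrom copies only the fields that are present on `other` into the builder: the string and bytes fields overwrite, while the message-typed scan field is merged recursively via mergeScan. A stand-alone sketch of that presence-driven merge, using a hypothetical FakeBuilder rather than the generated builder:

```java
// Hypothetical builder illustrating presence-driven merging: fields left
// unset on `other` do not touch the builder; set fields overwrite.
class FakeBuilder {
    String interpreterClassName;     // null = unset
    byte[] interpreterSpecificBytes; // null = unset

    FakeBuilder mergeFrom(FakeBuilder other) {
        if (other.interpreterClassName != null) {
            this.interpreterClassName = other.interpreterClassName;
        }
        if (other.interpreterSpecificBytes != null) {
            this.interpreterSpecificBytes = other.interpreterSpecificBytes;
        }
        return this;
    }
}

class MergeDemo {
    public static void main(String[] args) {
        FakeBuilder base = new FakeBuilder();
        base.interpreterClassName = "LongColumnInterpreter";

        FakeBuilder other = new FakeBuilder();
        other.interpreterSpecificBytes = new byte[] {42};

        base.mergeFrom(other);
        // interpreterClassName survives, because `other` left it unset.
        System.out.println(base.interpreterClassName);
        System.out.println(base.interpreterSpecificBytes.length); // prints 1
    }
}
```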
origin: harbby/presto-connectors

@SuppressWarnings("unchecked")
ColumnInterpreter<T,S,P,Q,R> constructColumnInterpreterFromRequest(
  AggregateRequest request) throws IOException {
 String className = request.getInterpreterClassName();
 Class<?> cls;
 try {
  cls = Class.forName(className);
  ColumnInterpreter<T,S,P,Q,R> ci = (ColumnInterpreter<T, S, P, Q, R>) cls.newInstance();
  if (request.hasInterpreterSpecificBytes()) {
   ByteString b = request.getInterpreterSpecificBytes();
   P initMsg = ProtobufUtil.getParsedGenericInstance(ci.getClass(), 2, b);
   ci.initialize(initMsg);
  }
  return ci;
 } catch (ClassNotFoundException e) {
  throw new IOException(e);
 } catch (InstantiationException e) {
  throw new IOException(e);
 } catch (IllegalAccessException e) {
  throw new IOException(e);
 }
}

Javadoc

optional bytes interpreter_specific_bytes = 3;
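Putting the javadoc fragments on this page together, the request message plausibly looks like the following .proto sketch. Field names and numbers are taken from the javadoc snippets shown here; this is a reconstruction, not a copy of the actual HBase .proto file, and surrounding options plus the Scan message definition are omitted:

```protobuf
// Reconstructed sketch of AggregateRequest, based on the field javadoc above.
message AggregateRequest {
  required string interpreter_class_name    = 1;
  required .hbase.pb.Scan scan              = 2;
  optional bytes interpreter_specific_bytes = 3;
}
```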

Popular methods of AggregateProtos$AggregateRequest

  • <init>
  • getDefaultInstance
  • getDescriptorForType
  • getInterpreterClassName
    required string interpreter_class_name = 1; The request passed to the AggregateService consists of
  • getInterpreterClassNameBytes
    required string interpreter_class_name = 1; The request passed to the AggregateService consists of
  • getInterpreterSpecificBytes
    optional bytes interpreter_specific_bytes = 3;
  • getScan
    required .hbase.pb.Scan scan = 2;
  • getSerializedSize
  • getUnknownFields
  • hasInterpreterClassName
    required string interpreter_class_name = 1; The request passed to the AggregateService consists of
  • hasScan
    required .hbase.pb.Scan scan = 2;
  • initFields
  • isInitialized
  • makeExtensionsImmutable
  • newBuilder
  • parseUnknownField
