/**
 * @param baseWriter RecordWriter to contain
 * @param context current TaskAttemptContext
 * @throws IOException
 * @throws InterruptedException
 */
public DynamicPartitionFileRecordWriterContainer(
    RecordWriter<? super WritableComparable<?>, ? super Writable> baseWriter,
    TaskAttemptContext context) throws IOException, InterruptedException {
  super(baseWriter, context);
  maxDynamicPartitions = jobInfo.getMaxDynamicPartitions();
  dynamicPartCols = jobInfo.getPosOfDynPartCols();
  if (dynamicPartCols == null) {
    throw new HCatException("It seems that setSchema() is not called on "
        + "HCatOutputFormat. Please make sure that method is called.");
  }
  this.baseDynamicSerDe = new HashMap<String, AbstractSerDe>();
  this.baseDynamicWriters =
      new HashMap<String, RecordWriter<? super WritableComparable<?>, ? super Writable>>();
  this.baseDynamicCommitters =
      new HashMap<String, org.apache.hadoop.mapred.OutputCommitter>();
  this.dynamicContexts =
      new HashMap<String, org.apache.hadoop.mapred.TaskAttemptContext>();
  this.dynamicObjectInspectors = new HashMap<String, ObjectInspector>();
  this.dynamicOutputJobInfo = new HashMap<String, OutputJobInfo>();
  this.HIVE_DEFAULT_PARTITION_VALUE =
      HiveConf.getVar(context.getConfiguration(), HiveConf.ConfVars.DEFAULTPARTITIONNAME);
}