TerminateFragmentRequestProto request = TerminateFragmentRequestProto.newBuilder()
    .setQueryIdentifier(constructQueryIdentifierProto(
        taskAttemptId.getTaskID().getVertexID().getDAGId().getId()))
    .setFragmentIdentifierString(taskAttemptId.toString())
    .build();
communicator.sendTerminateFragment(request, nodeId.getHostname(), nodeId.getPort(),
private SubmitWorkRequestProto createRequest(int fragmentNumber, int numSelfAndUpstreamTasks,
    int numSelfAndUpstreamComplete, int dagStartTime, int attemptStartTime,
    int withinDagPriority, String dagName) throws IOException {
  ApplicationId appId = ApplicationId.newInstance(9999, 72);
  TezDAGID dagId = TezDAGID.getInstance(appId, 1);
  TezVertexID vId = TezVertexID.getInstance(dagId, 35);
  return LlapDaemonTestUtils.buildSubmitProtoRequest(fragmentNumber, appId.toString(),
      dagId.getId(), vId.getId(), dagName, dagStartTime, attemptStartTime,
      numSelfAndUpstreamTasks, numSelfAndUpstreamComplete, withinDagPriority,
      new Credentials());
}
private String constructLlapLogUrl(final TezTaskAttemptID attemptID,
    final String containerIdString, final boolean isDone, final String nmAddress) {
  String dagId = attemptID.getTaskID().getVertexID().getDAGId().toString();
  String filename = JOINER.join(currentHiveQueryId, "-", dagId, ".log",
      (isDone ? ".done" : ""), "?nm.id=", nmAddress);
  String url = PATH_JOINER.join(timelineServerUri, "ws", "v1", "applicationhistory",
      "containers", containerIdString, "logs", filename);
  return url;
}
private static TimelineEntity convertVertexReconfigureDoneEvent(
    VertexConfigurationDoneEvent event) {
  TimelineEntity atsEntity = new TimelineEntity();
  atsEntity.setEntityId(event.getVertexID().toString());
  atsEntity.setEntityType(EntityTypes.TEZ_VERTEX_ID.name());

  atsEntity.addPrimaryFilter(ATSConstants.APPLICATION_ID,
      event.getVertexID().getDAGId().getApplicationId().toString());
  atsEntity.addPrimaryFilter(EntityTypes.TEZ_DAG_ID.name(),
      event.getVertexID().getDAGId().toString());

  TimelineEvent updateEvt = new TimelineEvent();
  updateEvt.setEventType(HistoryEventType.VERTEX_CONFIGURE_DONE.name());
  updateEvt.setTimestamp(event.getReconfigureDoneTime());

  Map<String, Object> eventInfo = new HashMap<String, Object>();
  if (event.getSourceEdgeProperties() != null && !event.getSourceEdgeProperties().isEmpty()) {
    Map<String, Object> updatedEdgeManagers = new HashMap<String, Object>();
    for (Entry<String, EdgeProperty> entry : event.getSourceEdgeProperties().entrySet()) {
      updatedEdgeManagers.put(entry.getKey(), DAGUtils.convertEdgeProperty(entry.getValue()));
    }
    eventInfo.put(ATSConstants.UPDATED_EDGE_MANAGERS, updatedEdgeManagers);
  }
  eventInfo.put(ATSConstants.NUM_TASKS, event.getNumTasks());
  updateEvt.setEventInfo(eventInfo);
  atsEntity.addEvent(updateEvt);

  atsEntity.addOtherInfo(ATSConstants.NUM_TASKS, event.getNumTasks());
  return atsEntity;
}
public TaskSpec constructTaskSpec(DAG dag, String vertexName, int numSplits,
    ApplicationId appId, int index) {
  Vertex vertex = dag.getVertex(vertexName);
  ProcessorDescriptor processorDescriptor = vertex.getProcessorDescriptor();
  List<RootInputLeafOutput<InputDescriptor, InputInitializerDescriptor>> inputs =
      vertex.getInputs();
  List<RootInputLeafOutput<OutputDescriptor, OutputCommitterDescriptor>> outputs =
      vertex.getOutputs();
  Preconditions.checkState(inputs.size() == 1);
  Preconditions.checkState(outputs.size() == 1);

  List<InputSpec> inputSpecs = new ArrayList<>();
  for (RootInputLeafOutput<InputDescriptor, InputInitializerDescriptor> input : inputs) {
    InputSpec inputSpec = new InputSpec(input.getName(), input.getIODescriptor(), 1);
    inputSpecs.add(inputSpec);
  }
  List<OutputSpec> outputSpecs = new ArrayList<>();
  for (RootInputLeafOutput<OutputDescriptor, OutputCommitterDescriptor> output : outputs) {
    OutputSpec outputSpec = new OutputSpec(output.getName(), output.getIODescriptor(), 1);
    outputSpecs.add(outputSpec);
  }

  TezDAGID dagId = TezDAGID.getInstance(appId, 0);
  TezVertexID vertexId = TezVertexID.getInstance(dagId, 0);
  TezTaskID taskId = TezTaskID.getInstance(vertexId, index);
  TezTaskAttemptID taskAttemptId = TezTaskAttemptID.getInstance(taskId, 0);
  return new TaskSpec(taskAttemptId, dag.getName(), vertexName, numSplits,
      processorDescriptor, inputSpecs, outputSpecs, null);
}
public static org.apache.hadoop.mapreduce.JobID toJobID(TezDAGID id) {
  return new JobID(
      String.valueOf(id.getApplicationId().getClusterTimestamp()),
      id.getId());
}
try {
  dagId = TezDAGID.fromString(proto.getDagId());
} catch (IllegalArgumentException e) {
  throw new IOException("Invalid dagId, summary records may be corrupted", e);
}
if (dagCounter < dagId.getId()) {
  dagCounter = dagId.getId();
}
dagAppMaster.dagIDs.add(dagSummaryData.dagId.toString());
lastRecoveryFile = dagRecoveryFile;
LOG.info("Trying to recover dag from recovery file"
    + ", dagId=" + lastInProgressDAG.toString()
    + ", dagRecoveryFile=" + dagRecoveryFile
    + ", len=" + fileStatus.getLen());
dagId = TezDAGID.getInstance(appAttemptID.getApplicationId(), dagCounter.incrementAndGet());
LOG.debug("JSON dump for submitted DAG, dagId=" + dagId.toString()
    + ", json=" + DAGUtils.generateSimpleJSONPlan(dagPB).toString());
case DAG_RECOVERED:
  String entityGroupId = numDagsPerGroup > 1
      ? event.getDagID().getGroupId(numDagsPerGroup)
      : event.getDagID().toString();
  return TimelineEntityGroupId.newInstance(event.getDagID().getApplicationId(),
      entityGroupId);
case APP_LAUNCHED:
case AM_LAUNCHED:
public static JobID toMRJobId(TezDAGID dagId) {
  return new JobID(
      Long.toString(dagId.getApplicationId().getClusterTimestamp()),
      dagId.getApplicationId().getId());
}
DAG getDAG(String dagIdStr) throws TezException {
  TezDAGID dagId;
  try {
    dagId = TezDAGID.fromString(dagIdStr);
  } catch (IllegalArgumentException e) {
    throw new TezException("Bad dagId: " + dagIdStr, e);
  }

  DAG currentDAG = getCurrentDAG();
  if (currentDAG == null) {
    throw new TezException("No running dag at present");
  }

  final String currentDAGIdStr = currentDAG.getID().toString();
  if (!currentDAGIdStr.equals(dagIdStr)) {
    if (getAllDagIDs().contains(dagIdStr)) {
      if (LOG.isDebugEnabled()) {
        LOG.debug("Looking for finished dagId " + dagIdStr
            + " current dag is " + currentDAGIdStr);
      }
      throw new DAGNotRunningException("DAG " + dagIdStr
          + " Not running, current dag is " + currentDAGIdStr);
    } else {
      LOG.warn("Current DAGID : " + currentDAGIdStr
          + ", Looking for string (not found): " + dagIdStr
          + ", dagIdObj: " + dagId);
      throw new TezException("Unknown dagId: " + dagIdStr);
    }
  }
  return currentDAG;
}
@Override
public boolean equals(Object o) {
  if (this == o) {
    return true;
  }
  if (o == null) {
    return false;
  }
  if (o.getClass() == this.getClass()) {
    DagIdentifierImpl other = (DagIdentifierImpl) o;
    return this.dagId.equals(other.dagId);
  } else {
    return false;
  }
}
Map<String, JSONObject> taskJsonMap = Maps.newHashMap();
Map<String, JSONObject> attemptJsonMap = Maps.newHashMap();
TezDAGID tezDAGID = TezDAGID.fromString(dagId);
String userName = null;
while (scanner.hasNext()) {
  String vertexName = entity;
  TezVertexID tezVertexID = TezVertexID.fromString(vertexName);
  if (!tezDAGID.equals(tezVertexID.getDAGId())) {
    LOG.warn(vertexName + " does not belong to " + tezDAGID);
    continue;
  }
  String taskName = entity;
  TezTaskID tezTaskID = TezTaskID.fromString(taskName);
  if (!tezDAGID.equals(tezTaskID.getVertexID().getDAGId())) {
    LOG.warn(taskName + " does not belong to " + tezDAGID);
    continue;
  }
  String taskAttemptName = entity;
  TezTaskAttemptID tezAttemptId = TezTaskAttemptID.fromString(taskAttemptName);
  if (!tezDAGID.equals(tezAttemptId.getTaskID().getVertexID().getDAGId())) {
    LOG.warn(taskAttemptName + " does not belong to " + tezDAGID);
    continue;
  }
}
public void fromProto(DAGCommitStartedProto proto) {
  this.dagID = TezDAGID.fromString(proto.getDagId());
}
private static TimelineEntity convertDAGInitializedEvent(DAGInitializedEvent event) {
  TimelineEntity atsEntity = new TimelineEntity();
  atsEntity.setEntityId(event.getDagID().toString());
  atsEntity.setEntityType(EntityTypes.TEZ_DAG_ID.name());

  TimelineEvent initEvt = new TimelineEvent();
  initEvt.setEventType(HistoryEventType.DAG_INITIALIZED.name());
  initEvt.setTimestamp(event.getInitTime());
  atsEntity.addEvent(initEvt);

  atsEntity.addPrimaryFilter(ATSConstants.USER, event.getUser());
  atsEntity.addPrimaryFilter(ATSConstants.APPLICATION_ID,
      event.getDagID().getApplicationId().toString());
  atsEntity.addPrimaryFilter(ATSConstants.DAG_NAME, event.getDagName());

  atsEntity.addOtherInfo(ATSConstants.INIT_TIME, event.getInitTime());

  if (event.getVertexNameIDMap() != null) {
    Map<String, String> nameIdStrMap = new TreeMap<String, String>();
    for (Entry<String, TezVertexID> entry : event.getVertexNameIDMap().entrySet()) {
      nameIdStrMap.put(entry.getKey(), entry.getValue().toString());
    }
    atsEntity.addOtherInfo(ATSConstants.VERTEX_NAME_ID_MAPPING, nameIdStrMap);
  }
  return atsEntity;
}
public static TezTaskAttemptID createTaskAttemptId(
    QueryIdentifierProto queryIdProto, int vertexIndex, int fragmentNum, int attemptNum) {
  // Come ride the API roller-coaster!
  return TezTaskAttemptID.getInstance(
      TezTaskID.getInstance(
          TezVertexID.getInstance(
              TezDAGID.getInstance(
                  ConverterUtils.toApplicationId(
                      queryIdProto.getApplicationIdString()),
                  queryIdProto.getDagIndex()),
              vertexIndex),
          fragmentNum),
      attemptNum);
}
public void getDagProgress() {
  setCorsHeaders();
  if (!hasAccess()) {
    sendErrorResponse(HttpServletResponse.SC_UNAUTHORIZED,
        "Access denied for user: " + request().getRemoteUser(), null);
    return;
  }
  int dagID;
  try {
    dagID = getQueryParamInt(WebUIService.DAG_ID);
  } catch (NumberFormatException e) {
    sendErrorResponse(HttpServletResponse.SC_BAD_REQUEST, "Invalid dag id:", e);
    return;
  }
  DAG currentDAG = appContext.getCurrentDAG();
  if (currentDAG == null || dagID != currentDAG.getID().getId()) {
    sendErrorResponse(HttpServletResponse.SC_NOT_FOUND, "Not current Dag: " + dagID, null);
    return;
  }
  Map<String, ProgressInfo> result = new HashMap<String, ProgressInfo>();
  result.put(DAG_PROGRESS, new ProgressInfo(currentDAG.getID().toString(),
      currentDAG.getCompletedTaskProgress()));
  renderJSON(result);
}
@Override
public void run() {
  BaseHttpConnection httpConnection = null;
  try {
    URL baseURL = TezRuntimeUtils.constructBaseURIForShuffleHandlerDagComplete(
        nodeId.getHost(), shufflePort, dag.getApplicationId().toString(),
        dag.getId(), false);
    httpConnection = TezRuntimeUtils.getHttpConnection(true, baseURL,
        httpConnectionParams, "DAGDelete", jobTokenSecretManager);
    httpConnection.connect();
    httpConnection.getInputStream();
  } catch (Exception e) {
    LOG.warn("Could not setup HTTP Connection to the node "
        + nodeId.getHost() + " for dag delete. ", e);
  } finally {
    try {
      if (httpConnection != null) {
        httpConnection.cleanup(true);
      }
    } catch (IOException ioe) {
      LOG.warn("Encountered IOException for " + nodeId.getHost() + " during close. ", ioe);
    }
  }
}
jobConf.setInt(MRInput.TEZ_MAPREDUCE_DAG_ATTEMPT_NUMBER,
    getContext().getDAGAttemptNumber());

TezDAGID tezDAGID = TezDAGID.getInstance(getContext().getApplicationId(),
    getContext().getDagIdentifier());
TezVertexID tezVertexID = TezVertexID.getInstance(tezDAGID,
    getContext().getTaskVertexIndex());
TezTaskID tezTaskID = TezTaskID.getInstance(tezVertexID, getContext().getTaskIndex());
TezTaskAttemptID tezTaskAttemptID = TezTaskAttemptID.getInstance(tezTaskID,
    getContext().getTaskAttemptNumber());

jobConf.set(MRInput.TEZ_MAPREDUCE_DAG_ID, tezDAGID.toString());
jobConf.set(MRInput.TEZ_MAPREDUCE_VERTEX_ID, tezVertexID.toString());
jobConf.set(MRInput.TEZ_MAPREDUCE_TASK_ID, tezTaskID.toString());