# boto/datapipeline/layer1.py
import boto
from boto.compat import json
from boto.connection import AWSQueryConnection
from boto.regioninfo import RegionInfo
from boto.exception import JSONResponseError
from boto.datapipeline import exceptions


class DataPipelineConnection(AWSQueryConnection):
    """
    This is the AWS Data Pipeline API Reference. This guide provides
    descriptions and samples of the AWS Data Pipeline API.

    AWS Data Pipeline is a web service that configures and manages a
    data-driven workflow called a pipeline. AWS Data Pipeline handles
    the details of scheduling and ensuring that data dependencies are
    met so your application can focus on processing the data.

    The AWS Data Pipeline API implements two main sets of
    functionality. The first set of actions configures the pipeline
    in the web service. You call these actions to create a pipeline
    and define data sources, schedules, dependencies, and the
    transforms to be performed on the data.

    The second set of actions is used by a task runner application
    that calls the AWS Data Pipeline API to receive the next task
    ready for processing. The logic for performing the task, such as
    querying the data, running data analysis, or converting the data
    from one format to another, is contained within the task runner.
    The task runner performs the task assigned to it by the web
    service, reporting progress to the web service as it does so.
    When the task is done, the task runner reports the final success
    or failure of the task to the web service.

    AWS Data Pipeline provides an open-source implementation of a
    task runner called AWS Data Pipeline Task Runner. AWS Data
    Pipeline Task Runner provides logic for common data management
    scenarios, such as performing database queries and running data
    analysis using Amazon Elastic MapReduce (Amazon EMR). You can use
    AWS Data Pipeline Task Runner as your task runner, or you can
    write your own task runner to provide custom data management.

    The AWS Data Pipeline API uses the Signature Version 4 protocol
    for signing requests. For more information about how to sign a
    request with this protocol, see `Signature Version 4 Signing
    Process`_. In the code examples in this reference, the Signature
    Version 4 request parameters are represented as AuthParams.
    """
    APIVersion = "2012-10-29"
    DefaultRegionName = "us-east-1"
    DefaultRegionEndpoint = "datapipeline.us-east-1.amazonaws.com"
    ServiceName = "DataPipeline"
    TargetPrefix = "DataPipeline"
    ResponseError = JSONResponseError

    _faults = {
        "PipelineDeletedException": exceptions.PipelineDeletedException,
        "InvalidRequestException": exceptions.InvalidRequestException,
        "TaskNotFoundException": exceptions.TaskNotFoundException,
        "PipelineNotFoundException": exceptions.PipelineNotFoundException,
        "InternalServiceError": exceptions.InternalServiceError,
    }

    def __init__(self, **kwargs):
        region = kwargs.pop('region', None)
        if not region:
            region = RegionInfo(self, self.DefaultRegionName,
                                self.DefaultRegionEndpoint)
        kwargs['host'] = region.endpoint
        super(DataPipelineConnection, self).__init__(**kwargs)
        self.region = region

    def _required_auth_capability(self):
        return ['hmac-v4']

    def report_task_runner_heartbeat(self, taskrunner_id,
                                     worker_group=None, hostname=None):
        """
        Task runners call this action to indicate that they are
        operational, so the web service can detect when a task runner
        application has failed.

        :type taskrunner_id: string
        :param taskrunner_id: The identifier of the task runner.

        :type worker_group: string
        :param worker_group: The type of task the task runner is
            configured to accept and process.

        :type hostname: string
        :param hostname: The public DNS name of the calling task
            runner.
        """
        params = {'taskrunnerId': taskrunner_id, }
        if worker_group is not None:
            params['workerGroup'] = worker_group
        if hostname is not None:
            params['hostname'] = hostname
        return self.make_request(action='ReportTaskRunnerHeartbeat',
                                 body=json.dumps(params))

    def set_status(self, pipeline_id, object_ids, status):
        """
        Requests that the status of an array of physical or logical
        pipeline objects be updated in the pipeline. This update may
        not occur immediately, but is eventually consistent. The
        status that can be set depends on the type of object.

        :type pipeline_id: string
        :param pipeline_id: Identifies the pipeline that contains the
            objects.

        :type object_ids: list
        :param object_ids: Identifies an array of objects. The
            corresponding objects can be either physical or
            components, but not a mix of both types.

        :type status: string
        :param status: Specifies the status to be set on all the
            objects in `objectIds`. For components, this can be
            either `PAUSE` or `RESUME`. For instances, this can be
            either `CANCEL`, `RERUN`, or `MARK_FINISHED`.
        """
        params = {
            'pipelineId': pipeline_id,
            'objectIds': object_ids,
            'status': status,
        }
        return self.make_request(action='SetStatus',
                                 body=json.dumps(params))

    def set_task_status(self, task_id, task_status, error_id=None,
                        error_message=None, error_stack_trace=None):
        params = {'taskId': task_id, 'taskStatus': task_status, }
        if error_id is not None:
            params['errorId'] = error_id
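# Every call in this client follows the same wire pattern: build a dict
# of camelCase parameters, JSON-encode it, and POST it with an
# `X-Amz-Target` header naming the `DataPipeline.<Action>` operation. The
# sketch below reproduces, offline, the request body that a SetStatus
# call would carry; the helper name `build_set_status_body` and the IDs
# are illustrative only, not part of boto.

```python
import json


def build_set_status_body(pipeline_id, object_ids, status):
    # Same camelCase parameter dict that the client serializes
    # before POSTing it to the DataPipeline.SetStatus action.
    params = {
        'pipelineId': pipeline_id,
        'objectIds': object_ids,
        'status': status,
    }
    return json.dumps(params, sort_keys=True)


# Hypothetical pipeline and object identifiers, for illustration only.
body = build_set_status_body('df-00000000EXAMPLE', ['o-1', 'o-2'], 'PAUSE')
print(body)
```

# The sorted-key JSON here is only for reproducible output; the service
# accepts the keys in any order.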