Overview

This example demonstrates how simple pipelines can be defined and executed with JFrog Pipelines. An example Pipelines DSL is used to show how to use integrations, resources, and steps to construct a simple, automated workflow.

This pipeline example demonstrates the following:

  • Creating a GitHub Integration.
  • Adding a Pipeline Source.
  • Creating a GitRepo trigger, which will trigger a step when the contents of the source control repository change.
  • Using inputResources and inputSteps to set up dependencies between steps and resources.
  • Using environment variables (e.g. $res_myFirstRepo_commitSha) to extract information from inputResources.
  • Using run state to pass information to downstream steps of a run.
  • Using pipeline state to pass information to subsequent runs.
  • Connecting dependent pipelines through resources.

Successful runs of the pipeline in this quickstart look like this:

[Image: PIPE_HelloWorld_1]

[Image: PIPE_HelloWorld_AltImage]

Before you Begin

Before trying this quickstart, ensure that you have:

  • A GitHub account. This is required for forking the example repository.
  • A JFrog Platform Cloud account, or self-hosted JFrog Pipelines.
  • At least one node pool. This is the set of nodes that all pipeline steps will execute in. For more information, see Managing Pipelines Node Pools.
  • A user account in Artifactory with deploy permissions to at least one binary repository.

Running this pipeline

Perform the following steps to run this pipeline:

  1. Fork repository

    The Pipelines DSL for this example is available in the jfrog-pipelines-hello-world repository in the JFrog GitHub account.

    The DSL file is a YAML file that contains the pipeline definitions. This example uses two YAML files:
    • jfrog-pipelines-hello-world.yml, which contains the declarations for the pipelines in this example
    • values.yml, which contains the values required for the jfrog-pipelines-hello-world.yml file.

    For a full breakdown of all the resources, pipelines, and steps used in the YAML file, see the jfrog-pipelines-hello-world.yml section below.

    Fork this repository to your account or organization. This is important since you need admin access to repositories that are used as Pipeline Sources or GitRepo resources, in order to add webhooks to these repositories and listen for change events.
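
    If you prefer the command line, this is a quick sketch using the GitHub CLI (the jfrog/jfrog-pipelines-hello-world path is assumed from this example; forking through the GitHub UI works just as well):

    # fork the example repository into your account and clone the fork locally
    gh repo fork jfrog/jfrog-pipelines-hello-world --clone
    cd jfrog-pipelines-hello-world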

  2. Sign in


    Sign in to JFrog Platform with your Artifactory credentials.

  3. Add integration


    Go to Administration | Pipelines | Integrations to add one integration:
    GitHub Integration: This integration is used to add the Pipeline Source, as well as the GitRepo resource defined in values.yml, to connect GitHub to Pipelines. Write down the GitHub integration name.

  4. Update values.yml


    The pipelines configuration is available in the values.yml file. Edit this file in your fork of this repo and replace the following:

    • gitProvider -- Provide the name of the GitHub integration you added in the previous step. Example: gitProvider: my_github
    • path -- Provide the path to your fork of this repository. Example: path: myuser/jfrog-pipelines-hello-world

    All pipeline and resource names are global across your JFrog Pipelines project. The names of your steps and resources need to be unique within the JFrog Pipelines project.
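
    As a rough sketch, after these edits your values.yml might look like the following (my_github and myuser are placeholders for your own integration name and GitHub account):

    myRepo:
      gitProvider: my_github                    # name of the GitHub integration added in step 3
      path: myuser/jfrog-pipelines-hello-world  # path to your fork of the repository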

  5. Add pipeline source

    The Pipeline Source represents the git repository where our pipeline definition files are stored. A pipeline source connects to the repository through an integration, which we added in step 3.
    1. In your left navigation bar, go to Administration | Pipelines | Pipeline Sources. Click on Add a Pipeline Source and then choose From YAML. Follow the instructions to add a Pipeline Source. This automatically adds your configuration to the platform and pipelines are created based on your YAML.
    2. After your pipeline source syncs successfully, navigate to Pipelines | My Pipelines in the left navbar to view the newly added pipelines. In this example, my_first_pipeline and my_second_pipeline are the names of our pipelines.


    3. Click the name of the pipeline. This renders a real-time, interactive diagram of the pipeline and the results of its most recent run.

  6. Execute the pipeline


    You can now commit to the repo to trigger your pipeline, or trigger it manually through the UI. The steps in the pipeline execute in sequence.
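
    For example, a minimal command-line way to trigger it (myuser is a placeholder for your GitHub account; the empty commit exists only to fire the GitRepo webhook):

    git clone https://github.com/myuser/jfrog-pipelines-hello-world.git
    cd jfrog-pipelines-hello-world
    git commit --allow-empty -m "Trigger my_first_pipeline"
    git push origin master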


    Once the pipeline has completed, a new run is listed.


    A successful run of the first pipeline triggers the execution of the second pipeline.



jfrog-pipelines-hello-world.yml

The jfrog-pipelines-hello-world.yml file is made up of resources, pipelines, and steps, as shown below:

Resources

This example uses the following types of resources:

GitRepo

A GitRepo resource is used to connect JFrog Pipelines to a source control repository. Adding it creates a webhook to the repo so that future commits will automatically create a new version with the webhook payload.

Resources

resources:
  - name: myFirstRepo
    type: GitRepo
    configuration:
      # SCM integration where the repository is located
      gitProvider: {{ .Values.myRepo.gitProvider }}
      # Repository path, including org name/repo name
      path: {{ .Values.myRepo.path }}  # manishas-jfrog/jfrog-pipelines-hello-world # replace with your repository name
      branches:
        # Specifies which branches will trigger dependent steps
        include: master
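
As a sketch of what this template renders to once the values from step 4 are substituted (my_github and myuser/jfrog-pipelines-hello-world are placeholders for your own values):

resources:
  - name: myFirstRepo
    type: GitRepo
    configuration:
      gitProvider: my_github
      path: myuser/jfrog-pipelines-hello-world
      branches:
        include: master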

name (Required)
  myFirstRepo is the name of the GitRepo resource pointing to the repository containing the YAML files and other source code required to build the image. This name is used to refer to the resource in steps, and must be unique across all repositories in your JFrog Pipelines environment.

gitProvider (Required)
  The name of the GitHub Integration. Its value is retrieved from the values.yml file.

path (Required)
  The path of the repository from the integration root. Its value is retrieved from the values.yml file.

branches (Optional)
  • include -- (optional) Regular expression to include branches from the repo
  • exclude -- (optional) Regular expression to exclude branches from the repo
  The include: master tag indicates that the GitRepo resource is listening to the master branch.
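
Because include and exclude accept regular expressions, a GitRepo resource can listen to several branches at once. A hedged sketch, with purely illustrative branch names:

branches:
  include: ^(master|release-.*)$   # trigger on master and any release-* branch
  exclude: ^experimental-.*        # ignore experimental branches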



Defining a GitRepo resource acts as the trigger for the pipeline.

PropertyBag

A PropertyBag resource is used to pass information from one pipeline to another, and to provide environment variables to a step in the format of a resource.

A PropertyBag resource can have any strings as properties, which will then be available as environment variables when the resource is an input to a step. When it is an output, steps can change the values of properties or add new ones.

Resources

resources:
  - name: myPropertyBag
    type: PropertyBag
    configuration:
      commitSha: 1
      runID: 1

name (Required)
  myPropertyBag is the name of the PropertyBag resource.

commitSha, runID (Required)
  Properties of the PropertyBag resource. Each tag should be a valid variable name (Bash or PowerShell) for the steps where it is to be an input or output, and the value a string. At least one property is required; multiple properties are allowed.
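
In a step, each property of an input PropertyBag surfaces as an environment variable named res_<resourceName>_<property>, and an output PropertyBag is updated with the write_output utility function; both patterns appear in the pipeline YAML below. A minimal sketch of the two directions, with illustrative values:

# when myPropertyBag is listed under inputResources:
echo "CommitSha from the bag: $res_myPropertyBag_commitSha"
echo "Run ID from the bag: $res_myPropertyBag_runID"

# when myPropertyBag is listed under outputResources:
write_output myPropertyBag commitSha=abc123 runID=42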

Pipelines

This example uses two pipelines:

  • my_first_pipeline is the name of the first pipeline, consisting of 3 linear steps. The last step outputs a resource of type PropertyBag.
  • my_second_pipeline is the name of the second pipeline, which contains a single step triggered by the PropertyBag resource updated by the first pipeline.

Steps

Both the my_first_pipeline and my_second_pipeline pipelines contain the following step type:

Bash

Bash is a generic step type that enables executing any shell command. This general-purpose step can be used to execute any action that can be scripted, even with tools and services that haven't been integrated with JFrog Pipelines. This is the most versatile of the steps while taking full advantage of what the lifecycle offers.

In our example:

  • p1_s1, p1_s2, and p1_s3 are the names of the Bash steps in the my_first_pipeline pipeline.
  • p2_s1 is the name of the Bash step in the my_second_pipeline pipeline.

The steps are defined so that they execute in an interdependent sequence. This means that each step's execution is triggered by the successful completion of a prior, prerequisite step (or steps). In our example, step 1's (p1_s1) completion triggers the execution of step 2 (p1_s2), completion of step 2 triggers execution of step 3 (p1_s3), and so on until all steps in the pipeline are executed.

Steps
pipelines:
  - name: my_first_pipeline
    steps:
      - name: p1_s1
        type: Bash
        configuration:
          inputResources:
            # Sets up step to be triggered when there are commit events to myFirstRepo
            - name: myFirstRepo
        execution:
          onExecute:
            # Data from input resources is available as env variables in the step
            - echo $res_myFirstRepo_commitSha
            # The next two commands add variables to run state, which is available to all downstream steps in this run
            # Run state documentation: https://www.jfrog.com/confluence/display/JFROG/Creating+Stateful+Pipelines#CreatingStatefulPipelines-RunState
            - add_run_variables current_runid=$run_id
            - add_run_variables commitSha=$res_myFirstRepo_commitSha
            # This variable is written to pipeline state in p1_s3.
            # So this will be empty during first run and will be set to prior run number in subsequent runs
            # Pipeline state documentation: https://www.jfrog.com/confluence/display/JFROG/Creating+Stateful+Pipelines#CreatingStatefulPipelines-PipelineState
            - echo "Previous run ID is $prev_runid"
      - name: p1_s2
        type: Bash
        configuration:
          inputSteps:
            - name: p1_s1
        execution:
          onExecute:
            # Demonstrates the availability of an env variable written to run state during p1_s1
            - echo $current_runid
      - name: p1_s3
        type: Bash
        configuration:
          inputSteps:
            - name: p1_s2
          outputResources:
            - name: myPropertyBag
        execution:
          onExecute:
            - echo $current_runid
            # Writes current run number to pipeline state
            - add_pipeline_variables prev_runid=$run_id
            # Uses a utility function to update the output resource with the commitSha that triggered this run
            # Dependent pipelines can be configured to trigger when this resource is updated
            # Utility functions documentation: https://www.jfrog.com/confluence/display/JFROG/Pipelines+Utility+Functions
            - write_output myPropertyBag commitSha=$commitSha runID=$current_runid
  - name: my_second_pipeline
    steps:
      - name: p2_s1
        type: Bash
        configuration:
          inputResources:
            # Sets up step to be triggered when myPropertyBag is updated
            - name: myPropertyBag
        execution:
          onExecute:
            # Retrieves the commitSha from input resource
            - echo "CommitSha is $res_myPropertyBag_commitSha"
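
To make the state handling concrete, this is roughly what two consecutive runs would print (values are hypothetical):

# Run 1: p1_s1 prints "Previous run ID is "   <- $prev_runid is empty; pipeline state has not been written yet
#        p1_s3 then stores this run's $run_id in pipeline state and updates myPropertyBag,
#        which triggers my_second_pipeline, where p2_s1 echoes the commitSha of the triggering commit
# Run 2: p1_s1 prints "Previous run ID is <run_id of run 1>"  <- the value persisted to pipeline state during run 1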

configuration

Specifies all optional configuration selections for the step's execution environment.

inputResources (Optional)
  A collection of named resources that will be used by a step as inputs.
  In this example:
  • Step p1_s1, in the first pipeline, is triggered when there are commit events to myFirstRepo, which is the name of the GitRepo resource.
  • Step p2_s1, in the second pipeline, is triggered when the myPropertyBag resource is updated.

inputSteps (Optional)
  A collection of named steps whose completion will trigger execution of this step.
  In this example:
  • Completion of step p1_s1 triggers the execution of step p1_s2.
  • Completion of step p1_s2 triggers the execution of step p1_s3.

outputResources (Optional)
  A collection of named resources that will be generated or changed by a step.
  In this example, step p1_s3 updates the myPropertyBag resource.
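
Putting these tags together, a hedged sketch of a step configuration (the step and resource names here are illustrative, not part of this example):

- name: example_step
  type: Bash
  configuration:
    inputSteps:
      - name: some_previous_step   # run only after this step completes successfully
    inputResources:
      - name: someGitRepo          # also run when this resource is updated
    outputResources:
      - name: someBag              # this step may update someBag, e.g. via write_output
  execution:
    onExecute:
      - echo "Triggered by commit $res_someGitRepo_commitSha"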

execution

Declare sets of shell command sequences to perform for different execution phases:

onExecute (Optional)
  Main commands to execute for the step.
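
In addition to onExecute, JFrog Pipelines supports further execution blocks such as onStart, onSuccess, onFailure, and onComplete. The sketch below assumes those phases and uses illustrative commands:

execution:
  onStart:
    - echo "runs before onExecute"
  onExecute:
    - echo "main commands for the step"
  onSuccess:
    - echo "runs only if the step succeeded"
  onFailure:
    - echo "runs only if the step failed"
  onComplete:
    - echo "always runs, whether the step succeeded or failed"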


