Tekton Pipelines in OpenShift – A Practical Guide
Introduction to Tekton Pipelines
What is Tekton?
Tekton is a powerful, Kubernetes-native open-source framework designed for building robust CI/CD (Continuous Integration and Continuous Delivery) systems. It provides a standardized and scalable way to define workflows as code using YAML configuration files. By treating CI/CD pipelines as first-class Kubernetes resources, Tekton empowers developers to automate build, test, and deployment processes directly within the Kubernetes ecosystem.
Tekton’s modular architecture allows users to:
· Define reusable Tasks and Pipelines
· Orchestrate complex workflows
· Leverage standard Kubernetes tools (kubectl, oc)
· Integrate with GitOps and other DevOps practices
Why Tekton in OpenShift?
OpenShift, Red Hat’s enterprise Kubernetes platform, seamlessly integrates Tekton as OpenShift Pipelines. This native integration allows developers and DevOps teams to:
· Create, manage, and monitor pipelines using the OpenShift Console and CLI.
· Use Kubernetes-native Custom Resource Definitions (CRDs) to define CI/CD components.
· Securely manage secrets, environment variables, and credentials with built-in OpenShift features.
· Build GitOps-ready automation using Tekton Triggers and OpenShift Git repositories.
With Tekton on OpenShift, organizations can implement a consistent and cloud-native CI/CD strategy that is scalable, declarative, and aligned with Kubernetes best practices.
Key Concepts of Tekton Pipelines
Understanding the foundational components of Tekton is essential to designing effective and modular CI/CD pipelines. Tekton breaks down workflows into smaller, reusable elements that work seamlessly within Kubernetes.
1. Task
A Task is the fundamental building block in Tekton. It represents a single, reusable unit of work—such as cloning a Git repository, building a container image, or running tests. Each Task is composed of one or more Steps, where each step runs in its own container. Tasks can be stored in a catalog and reused across different pipelines.
Example: A Maven build task with steps to compile and package a Java application.
2. Pipeline
A Pipeline is a higher-level resource that defines a sequence of Tasks. It determines the execution order and dependencies between tasks. Pipelines allow conditional execution, parallel processing, and data sharing between tasks via parameters and workspaces.
Example: A pipeline with tasks to build, test, and deploy an application in sequence.
3. PipelineRun
A PipelineRun is the actual execution instance of a Pipeline. It binds parameter values, workspaces, and resources to run the defined sequence of tasks. Each time you run a CI/CD process, a new PipelineRun is created.
Think of it as pressing “Run” on a defined workflow with specific inputs.
4. TaskRun
A TaskRun is the execution instance of a Task. When a pipeline is executed, Tekton creates one TaskRun for each task. It tracks status, logs, and results for that specific run.
Each TaskRun corresponds to one task execution with its own logs and outcome.
5. Workspace
Workspaces are shared volumes that allow data to persist across different steps within a task or between tasks in a pipeline. Workspaces are essential for passing artifacts like source code or build outputs from one task to another.
Example: A cloned Git repository stored in a workspace can be accessed by a build and deploy task.
6. Parameters & Results
Parameters: Used to pass configurable values into tasks and pipelines. These make the pipeline dynamic and reusable.
Results: Used to share output values from one task to another. They enable decision-making and chaining logic across the pipeline.
Example: A test task can output a result (pass/fail), which a deploy task uses to decide whether to proceed.
Installing OpenShift Pipelines
To use Tekton Pipelines in OpenShift, you need to install the OpenShift Pipelines Operator, which provides the Tekton framework integrated with the OpenShift ecosystem.
Prerequisites
Before you begin the installation, ensure the following prerequisites are met:
OpenShift 4.x Cluster: A running OpenShift Container Platform version 4.x or above.
Cluster Administrator Access: Required to install operators and manage cluster-wide resources.
Internet Access: To fetch the operator and related container images from Red Hat and Tekton registries.
Installation Steps
Follow these steps to install OpenShift Pipelines using the OpenShift web console:
1. Login to the OpenShift Console: Access your OpenShift Web Console using a cluster administrator account.
2. Navigate to OperatorHub: In the left sidebar, go to Operators → OperatorHub.
3. Search for the OpenShift Pipelines Operator: Use the search bar to find OpenShift Pipelines. This operator enables Tekton functionality within OpenShift.
4. Install the Operator: Click on the operator tile. Choose either a specific namespace or All Namespaces on the Cluster for a global install. Select the stable channel (recommended). Click Install.
5. Wait for the Operator to Deploy: The Operator Lifecycle Manager (OLM) handles the installation. Wait until the Status changes to Succeeded.
Once installed, the OpenShift Pipelines Operator adds Tekton CRDs to the cluster, allowing you to create and manage Tekton-based CI/CD resources directly from the console or via CLI (oc or kubectl).
Tip: You can verify the installation by navigating to Pipelines → Pipelines in the console or running oc get pipelines.tekton.dev in the terminal.
Tekton CRDs Overview
When the OpenShift Pipelines Operator is installed, it brings in a set of Custom Resource Definitions (CRDs) that extend the Kubernetes API. These CRDs define Tekton’s core building blocks and enable the platform to treat CI/CD pipelines as native Kubernetes objects.
All Tekton resources are written in YAML and managed just like standard Kubernetes resources using tools like oc or kubectl.
Key Tekton CRDs Installed in OpenShift Pipelines
1. Task
Defines a reusable set of sequential steps (containers) that perform a specific job such as building code or running tests.
2. TaskRun
Represents a single execution (run) of a Task. It records the execution details, including inputs, outputs, logs, and status.
3. Pipeline
Defines an ordered collection of tasks to be executed together as part of a CI/CD workflow. It supports parallel execution, conditional logic, and parameter passing.
4. PipelineRun
Represents an execution of a Pipeline. It provides runtime configuration (e.g., parameter values, workspace bindings) and monitors the pipeline’s progress and results.
5. PipelineResource (Deprecated)
Originally used to represent external resources like Git repositories or image registries. This CRD is now deprecated in favor of parameters and workspaces.
6. Tekton Triggers CRDs
These enable event-driven pipeline execution:
· TriggerTemplate – Defines the blueprint for the resource to create when an event fires, typically a TaskRun or PipelineRun.
· TriggerBinding – Maps event payload data (e.g., from a webhook) to parameters.
· EventListener – Listens for HTTP events (e.g., GitHub webhook) and triggers pipelines accordingly.
Tekton CRDs form the foundation for defining and running CI/CD pipelines on OpenShift. Because they are Kubernetes-native, you can version, audit, and manage them like any other Kubernetes object, making them ideal for GitOps and automation workflows.
Creating Your First Task
Before building complex pipelines, it’s important to understand how to create and execute a Task, the foundational unit in Tekton. A Task encapsulates one or more containerized steps that perform a specific operation.
In this section, we’ll create a simple “Hello World” task to get hands-on with Tekton.
Step 1: Define the Task
The following YAML defines a task that prints a message using a basic Ubuntu container:
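A minimal version of such a task might look like this (the resource name hello-task, the step name, and the message are illustrative; the exact API version may vary across Tekton releases):

```yaml
apiVersion: tekton.dev/v1
kind: Task
metadata:
  name: hello-task
spec:
  steps:
    - name: say-hello
      image: ubuntu          # base image the step container runs in
      script: |
        #!/bin/bash
        echo "Hello World from Tekton!"
```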
Explanation:
· kind: Task – Declares this as a Tekton Task resource.
· steps – A list of individual actions (containers). Each step runs in its own container.
· image: ubuntu – The base container image used to run the script.
· script – Bash script that gets executed within the container.
Save the above YAML to a file, for example hello-task.yaml.
Step 2: Create a TaskRun
To execute the task, define a TaskRun that references it.
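A minimal TaskRun could look like this (the names are illustrative and assume a previously created task called hello-task):

```yaml
apiVersion: tekton.dev/v1
kind: TaskRun
metadata:
  name: hello-task-run
spec:
  taskRef:
    name: hello-task   # references the Task defined earlier
```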
This YAML tells Tekton to run the task once.
Save this file as, for example, hello-taskrun.yaml.
Step 3: Apply the Resources
Use the OpenShift CLI to create and run the task:
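Assuming the Task and TaskRun were saved as hello-task.yaml and hello-taskrun.yaml (file and resource names are illustrative), the flow might look like this:

```shell
# Create the Task definition in the current project
oc apply -f hello-task.yaml

# Create the TaskRun, which starts the execution
oc apply -f hello-taskrun.yaml

# Watch the TaskRun status; SUCCEEDED shows True on completion
oc get taskrun hello-task-run

# Optionally stream the logs with the Tekton CLI, if installed
tkn taskrun logs hello-task-run -f
```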
You can monitor the execution status with oc get taskrun or, if the Tekton CLI is installed, stream the logs with tkn taskrun logs. When the run finishes, the TaskRun reports SUCCEEDED as True and the logs show the printed message.
Once completed, this confirms that Tekton is installed and functioning correctly on your OpenShift cluster.
Creating a Multi-Step Task
Tekton Tasks are not limited to single-step operations. You can define multiple steps within a task to perform complex operations sequentially, all within the same Pod. Each step runs in its own container but shares the Pod's mounted volumes (such as workspaces) and can use a common working directory.
In this example, we'll define a task that builds a Java application using Maven.
Multi-Step Task Example: Maven Build
Explanation:
· image: maven:3.8.1-jdk-11 – Uses an official Maven image with JDK 11 installed.
· workingDir: /workspace/source – Refers to a shared workspace directory, typically used to mount source code repositories or shared files.
· script – Runs mvn clean package to compile and package the Java application.
Real-World Use Case:
This kind of task is typically part of a larger CI/CD pipeline that:
1. Clones a Git repository (using a separate Git-clone task).
2. Builds the project with Maven (this step).
3. Runs tests or code scans.
4. Creates and pushes a container image.
Optional Enhancements:
You can extend this task by adding more steps such as:
· Code linting
· Running unit tests
· Creating build artifacts
Defining a Pipeline
A Pipeline in Tekton orchestrates the execution of multiple tasks in a defined sequence or parallel. It allows you to model complex workflows by connecting tasks, defining their order, and enabling parameter passing and workspace sharing.
Here’s a simple example of a pipeline that runs a single task.
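A sketch of such a pipeline, assuming a previously created task named hello-task (resource names are illustrative):

```yaml
apiVersion: tekton.dev/v1
kind: Pipeline
metadata:
  name: hello-pipeline
spec:
  tasks:
    - name: say-hello        # name of this task within the pipeline
      taskRef:
        name: hello-task     # the Task resource to execute
```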
Key Points:
· kind: Pipeline defines a new pipeline resource.
· tasks: is a list where each item references a Task by name.
· taskRef: points to the pre-defined task to execute.
Extensibility:
A pipeline can include multiple tasks, executed either sequentially or in parallel, with ordering expressed through runAfter dependencies or other features. Parameters and workspaces allow dynamic input and shared data across tasks.
Executing a Pipeline
Once you have a Pipeline defined, you trigger it by creating a PipelineRun resource. This represents a single execution instance of the pipeline.
Example YAML to run the pipeline:
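A minimal PipelineRun might look like this (the pipeline name hello-pipeline and the run name are illustrative):

```yaml
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: hello-pipeline-run
spec:
  pipelineRef:
    name: hello-pipeline   # the Pipeline resource to execute
```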
How to Execute:
1. Save the Pipeline and PipelineRun definitions as separate YAML files.
2. Apply the pipeline definition: oc apply -f <pipeline-file>.yaml
3. Apply the pipeline run to start execution: oc apply -f <pipelinerun-file>.yaml
4. Monitor the PipelineRun status either through:
o OpenShift Console under the Pipelines section, or
o CLI with oc get pipelinerun (or tkn pipelinerun logs if the Tekton CLI is installed).
· PipelineRun binds the pipeline definition to actual execution.
· Multiple PipelineRuns can be executed concurrently or sequentially.
· You can override parameters and workspaces in PipelineRun to customize each run.
Using Workspaces for Shared Data
In Tekton pipelines, Workspaces provide a mechanism to share data between tasks during pipeline execution. They are backed by Kubernetes volumes and allow multiple tasks and steps to access and persist files such as source code, build artifacts, or caches.
Why Use Workspaces?
· Enable sharing of files like source code or binaries between tasks.
· Allow caching of dependencies to speed up builds.
· Facilitate state persistence between steps or tasks in the same pipeline.
Example: Defining a Workspace in a PipelineRun
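A sketch, assuming a pipeline that declares a workspace named shared-workspace and a pre-created PVC named source-pvc (all names are illustrative):

```yaml
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: build-pipeline-run
spec:
  pipelineRef:
    name: build-pipeline
  workspaces:
    - name: shared-workspace     # must match a workspace declared by the pipeline
      persistentVolumeClaim:
        claimName: source-pvc    # pre-existing PVC backing the workspace
```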
· name: The logical name of the workspace used in the pipeline or task.
· persistentVolumeClaim: Binds the workspace to a pre-existing PersistentVolumeClaim (PVC) by its claimName.
How Workspaces are Used in Tasks
Within a task or pipeline spec, workspaces are declared and then mounted inside containers. For example:
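For example, a task might declare a workspace and then read from it in a step (task, workspace, and image names are illustrative):

```yaml
apiVersion: tekton.dev/v1
kind: Task
metadata:
  name: list-source
spec:
  workspaces:
    - name: source
      description: Holds the shared source code
  steps:
    - name: list-files
      image: ubuntu
      script: |
        # $(workspaces.source.path) resolves to /workspace/source by default
        ls -l $(workspaces.source.path)
```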
And in a step, you can use the workspace path, exposed as $(workspaces.<name>.path) (which defaults to /workspace/<name>), to access shared files.
Workspaces are a critical feature in Tekton that enable seamless file sharing and persistence across tasks in a pipeline, making it possible to coordinate complex build and deployment workflows efficiently.
Adding Parameters to Pipelines
Parameters add flexibility and reusability to Tekton pipelines by allowing you to pass dynamic input values at runtime. This enables a single pipeline or task definition to serve multiple use cases or environments.
Defining Parameters in a Pipeline
Here’s an example of a pipeline that accepts a string parameter.
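A sketch of such a pipeline, assuming a task named print-task that accepts a message parameter (all names and the default value are illustrative):

```yaml
apiVersion: tekton.dev/v1
kind: Pipeline
metadata:
  name: greeting-pipeline
spec:
  params:
    - name: message
      type: string
      default: "Hello from Tekton"   # used when no value is supplied
  tasks:
    - name: print-message
      taskRef:
        name: print-task
      params:
        - name: message
          value: $(params.message)   # forwards the pipeline parameter to the task
```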
Using Parameters in Tasks
Within the task steps, parameters are accessed using the $(params.<name>) syntax. For example:
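The hypothetical print-task could echo the parameter like this (a fragment of a Task spec; names are illustrative):

```yaml
# Fragment of the print-task Task spec
spec:
  params:
    - name: message
      type: string
  steps:
    - name: print
      image: ubuntu
      script: |
        echo "$(params.message)"   # substituted before the step runs
```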
This will print the value passed for the parameter during execution.
Passing Parameters in PipelineRun
When you trigger a pipeline, you can override default parameter values via the PipelineRun resource.
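For example, assuming a pipeline named greeting-pipeline with a message parameter (names and values are illustrative):

```yaml
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: greeting-pipeline-run
spec:
  pipelineRef:
    name: greeting-pipeline
  params:
    - name: message
      value: "Deploying to staging"   # overrides the pipeline's default
```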
This causes the task to print the supplied value instead of the default message.
· Parameters make pipelines dynamic and configurable.
· They enable reuse of pipeline definitions across different projects and environments.
· Parameters can be of various types (string, array, etc.) and have default values.
Tekton Triggers for GitOps
In modern DevOps workflows, automation is key. Tekton Triggers allow you to automate pipeline executions in response to external events such as Git commits, pull requests, or other webhook events, enabling true GitOps-style continuous delivery.
Core Components of Tekton Triggers:
· EventListener – A Kubernetes resource that listens for incoming HTTP events (webhooks) from external systems like GitHub, GitLab, or Bitbucket.
· TriggerTemplate – Defines the blueprint for the resources to be created (typically a PipelineRun) when an event is received.
· TriggerBinding – Maps data from the incoming event payload (like branch name or commit ID) to the parameters used in the TriggerTemplate.
How It Works:
1. A developer pushes code to a Git repository.
2. Git sends a webhook event to the EventListener.
3. The TriggerBinding extracts relevant details from the payload.
4. The TriggerTemplate uses those details to create a customized PipelineRun.
5. Tekton runs the pipeline automatically without manual intervention.
Benefits:
· Enables event-driven CI/CD pipelines.
· Automates deployment workflows tied directly to source code changes.
· Integrates with existing GitOps workflows for continuous deployment.
Example: GitHub Webhook Trigger
Tekton Triggers can automate pipeline execution based on GitHub events like code pushes or pull requests. Below is a sample definition to receive GitHub webhook events:
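One possible set of definitions is sketched below. All resource names, the service account, and the referenced build-pipeline are illustrative; the binding fields assume a GitHub push payload, and Triggers resources live in the triggers.tekton.dev API group:

```yaml
apiVersion: triggers.tekton.dev/v1beta1
kind: EventListener
metadata:
  name: github-listener
spec:
  serviceAccountName: pipeline        # SA with permission to create PipelineRuns
  triggers:
    - name: github-push
      bindings:
        - ref: github-push-binding
      template:
        ref: github-pipeline-template
---
apiVersion: triggers.tekton.dev/v1beta1
kind: TriggerBinding
metadata:
  name: github-push-binding
spec:
  params:
    - name: git-revision
      value: $(body.head_commit.id)   # commit SHA from the push payload
    - name: git-repo-url
      value: $(body.repository.url)
---
apiVersion: triggers.tekton.dev/v1beta1
kind: TriggerTemplate
metadata:
  name: github-pipeline-template
spec:
  params:
    - name: git-revision
    - name: git-repo-url
  resourcetemplates:
    - apiVersion: tekton.dev/v1
      kind: PipelineRun
      metadata:
        generateName: github-build-   # unique name per triggered run
      spec:
        pipelineRef:
          name: build-pipeline
        params:
          - name: revision
            value: $(tt.params.git-revision)
          - name: url
            value: $(tt.params.git-repo-url)
```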
· EventListener listens for incoming HTTP requests from GitHub.
· TriggerBinding – maps GitHub event payload fields (such as the commit ID and repository URL) to Tekton parameters.
· TriggerTemplate – defines the PipelineRun resource to create upon receiving the event.
Setting up the GitHub Webhook
1. Deploy the EventListener in your OpenShift cluster.
2. Expose it externally using an OpenShift Route.
3. In your GitHub repository, configure a webhook pointing to the EventListener’s route URL.
4. Choose the events to trigger the webhook (e.g., push, pull request).
5. Once set up, pipeline runs will be triggered automatically by GitHub events.
CI/CD Use Case – Build & Deploy Java App
A typical CI/CD pipeline for a Java application using Tekton on OpenShift includes the following sequential tasks:
1. Clone the Git Repository: Check out the source code using a git-clone task.
2. Build the Application with Maven: Compile and package the Java app using a Maven build task.
3. Build the Container Image: Use a task to build a Docker or OCI-compliant container image.
4. Push the Image to a Registry: Push the built image to an image registry such as OpenShift’s integrated registry or Docker Hub.
5. Deploy to OpenShift: Deploy or update the application in the OpenShift cluster using deployment tasks.
Each of these steps is defined as an individual Task, and they are orchestrated in a Pipeline to enable automated and repeatable builds and deployments.
Best Practices
To build maintainable, scalable, and secure pipelines, consider the following best practices:
· Use Shared Workspaces – Share source code, artifacts, and caches between tasks efficiently to avoid redundant work.
· Parameterize Pipelines – Make pipelines dynamic by using parameters for environment-specific values like image tags or URLs.
· Manage Secrets Securely – Store sensitive data such as credentials and tokens in OpenShift Secrets and inject them as environment variables or files.
· Separate Build and Deploy Stages – Keep build and deployment logic in distinct tasks or pipelines to improve modularity and reusability.
· Leverage the Tekton Catalog – Use pre-built, community-maintained reusable tasks from the Tekton Catalog to speed up pipeline development.
Conclusion and Learning Resources
Tekton Pipelines integrated with OpenShift provide a powerful, scalable, and Kubernetes-native platform for automating CI/CD workflows. By leveraging Tekton’s declarative pipeline definitions and event-driven triggers, teams can achieve faster, more reliable software delivery aligned with modern DevOps and GitOps practices.
Recommended Learning Resources:
· Tekton Documentation: https://tekton.dev/docs/
· OpenShift Pipelines Documentation: https://docs.openshift.com/container-platform/latest/cicd/pipelines/
· Tekton Catalog (Reusable Tasks): https://hub.tekton.dev