Category "pipeline"

SONAR_HOST_URL not reachable in Bitbucket pipeline SonarQube

I am trying to integrate SonarQube into a Bitbucket pipeline, and have the following code there: - pipe: sonarsource/sonarqube-scan:1.0.0 variables:

AttributeError: 'numpy.ndarray' object has no attribute 'transform'

I want to create a sklearn pipeline that consists of two steps: a custom transformer function and a Keras classification model. This is my data set (of course, I'm provi
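This AttributeError typically means one of the pipeline steps is a plain numpy array or a function's return value rather than a transformer object. Below is a minimal sketch of what such a two-step pipeline could look like, assuming scikeras is available to wrap the Keras model; the feature function and model architecture are hypothetical stand-ins, not the asker's code.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import FunctionTransformer
from scikeras.wrappers import KerasClassifier
from tensorflow import keras

def custom_features(X):
    # hypothetical feature engineering: log-scale the inputs
    return np.log1p(np.abs(X))

def build_model(meta):
    # scikeras passes dataset metadata (n_features_in_, etc.) via `meta`
    model = keras.Sequential([
        keras.layers.Input(shape=(meta["n_features_in_"],)),
        keras.layers.Dense(16, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model

pipe = Pipeline([
    # step 1: a plain function wrapped so it exposes fit/transform
    ("features", FunctionTransformer(custom_features)),
    # step 2: the Keras model wrapped as an sklearn-compatible classifier
    ("clf", KerasClassifier(model=build_model, epochs=5, verbose=0)),
])

X, y = np.random.rand(100, 4), np.random.randint(0, 2, 100)
pipe.fit(X, y)
```

Every step except the last must expose fit/transform, which FunctionTransformer provides for a plain Python function.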

How to reference local files in a GitLab pipeline (File does not exist error message)

I have the following configuration in my GitLab project: cont_eval: variables: DOCKER_TLS_CERTDIR: "/certs" stage: cont_eval

What is the difference between BZ and BNZ in an instruction pipeline?

I am confused between the branching instructions BZ and BNZ. Can anybody please explain the concept and working of BZ and BNZ with an example?
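In short, BZ branches when the tested value is zero and BNZ branches when it is non-zero. The toy simulator below, for a made-up mini-ISA (not any real assembler), shows the usual count-down loop built on BNZ; it illustrates only the branch semantics, not the pipeline stalls such branches cause.

```python
# Toy mini-ISA: BZ branches when the tested register is zero,
# BNZ branches when it is non-zero.
def run(program, regs):
    pc = 0                                   # program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == "PRINT":
            print("counter =", regs[args[0]])
            pc += 1
        elif op == "DEC":
            regs[args[0]] -= 1
            pc += 1
        elif op == "BZ":                     # branch if register == 0
            pc = args[1] if regs[args[0]] == 0 else pc + 1
        elif op == "BNZ":                    # branch if register != 0
            pc = args[1] if regs[args[0]] != 0 else pc + 1
    return regs

# The classic count-down loop: jump back to address 0 while r0 != 0.
run([("PRINT", "r0"),
     ("DEC", "r0"),
     ("BNZ", "r0", 0)], {"r0": 3})
```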

Tekton pipeline conditional run

I have a pipeline with the following tasks. - name: A taskRef: name: buildah-secondary-tag-task runAfter: - maven-prepare-package

I want to run gitlab-runner so that it only executes commands, like a hook

I have a dedicated server with a GitLab Runner attached. We just want to run something like this on a file path: git pull, npm install, npm run prod. The problem is git-

How to fix? "kex_exchange_identification: read: Connection reset by peer"

I want to copy data with scp in a GitLab pipeline using PRIVATE_KEY. The error is: kex_exchange_identification: read: Connection reset by peer Connection reset by x.x

Need to perform AWS calls for account xxx, but no credentials have been configured

I'm trying to deploy my stack to AWS using cdk deploy my-stack. When doing it in my terminal window it works perfectly, but when I'm doing it in my pipeline I ge

GitLab pass variable from one pipeline to another

We have a master pipeline, which is responsible for triggering pipelines from multiple projects and performing some steps. I want to pass a file from the first pipe

On-Premises MLOps Pipeline Stack

My motive is to build an MLOps pipeline which is 100% independent from cloud services like AWS, GCP and Azure. I have a project for a client in a production factor

Azure DevOps Pipeline configuration for Dynamics 365

Need help in designing a CI/CD pipeline for deploying the Dynamics 365 solution in different environments. Present CI/CD configuration setup: we have 4 BUs,

Accept merge request without running manual stages

I have a pipeline with 3 stages: build, deploy-test and deploy-prod. I want the stages to have the following behavior: always run build, run deploy-test automatically

How to install the gstreamer 1.0 bad plugin for opencv?

On an Ubuntu 18.04 machine I am trying to use opencv 4.1.2 facedetect in a gstreamer 1.14.5 pipeline, but unfortunately the plugin is not installed. I downloaded

How to properly pickle sklearn pipeline when using custom transformer

I am trying to pickle a sklearn machine-learning model and load it in another project. The model is wrapped in a pipeline that does feature encoding, scaling, etc
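A minimal single-file sketch of the usual fix follows; in practice the custom class would live in its own module (say my_transformers.py, a made-up name) that both projects can import, because pickle stores only the class's import path, not its code.

```python
import joblib
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

class LogScaler(BaseEstimator, TransformerMixin):
    # hypothetical custom transformer; in real use, keep it out of __main__
    # and out of notebook cells so the loading project can import it
    def fit(self, X, y=None):
        return self
    def transform(self, X):
        return np.log1p(np.abs(X))

pipe = Pipeline([("scale", LogScaler()), ("clf", LogisticRegression())])
X, y = np.random.rand(50, 3), np.random.randint(0, 2, 50)
pipe.fit(X, y)

joblib.dump(pipe, "model.joblib")

# In the other project, `from my_transformers import LogScaler` must succeed
# (i.e. the module must be on the path) before this load works.
restored = joblib.load("model.joblib")
print(restored.predict(X[:5]))
```

An alternative is cloudpickle or dill, which can serialize the class definition itself, at the cost of tighter version coupling between the two projects.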

Required context class hudson.FilePath is missing Perhaps you forgot to surround the code with a step that provides this, such as: node

When I load another Groovy file in my Jenkinsfile it shows me the following error: "Required context class hudson.FilePath is missing Perhaps you forgot to surround th

ValueError: A given column is not a column of the dataframe in pipeline and columntransformer

I am working on a toy dataset with ColumnTransformer and pipeline, but I came across an error for which I couldn't find a solution on the internet. toy = pd.read_
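Without the full code it is hard to be certain, but this ValueError usually means a name listed in the ColumnTransformer is not present in the frame actually passed to fit, for example because the column was dropped earlier or the target column was left in the transformer spec. A small self-contained sketch with made-up columns:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

toy = pd.DataFrame({
    "age":    [25, 32, 47, 51],
    "city":   ["NY", "LA", "NY", "SF"],
    "bought": [0, 1, 1, 0],
})

X, y = toy[["age", "city"]], toy["bought"]   # keep X as a DataFrame

pre = ColumnTransformer([
    # every name listed here must exist in X.columns, with matching case
    ("num", StandardScaler(), ["age"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["city"]),
])

pipe = Pipeline([("pre", pre), ("clf", LogisticRegression())])
pipe.fit(X, y)
# listing "bought" in `pre` after it was moved into y would raise
# "A given column is not a column of the dataframe"
```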

How to pass a parameter to a Python script from a pipeline [closed]

I am building an Azure Data Factory pipeline and I would like to know how to get this parameter into the Python script. The Python script is l
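Only the Python side is sketched below, since the pipeline-side wiring depends on which activity runs the script (Batch custom activity, Databricks, etc.); in most setups the pipeline parameter ends up as a command-line argument, so the script just parses argv. The parameter name --input_date is made up for illustration.

```python
import argparse

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--input_date", required=True,
                        help="hypothetical parameter passed in by the ADF pipeline")
    args = parser.parse_args()
    # use the value wherever the script needs it
    print(f"Running for {args.input_date}")

if __name__ == "__main__":
    main()
```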

How can I implement SMOTE inside a ColumnTransformer?

I'm trying to implement SMOTENC inside a ColumnTransformer. However, I'm getting an error. The code and the error are provided below. #Create a mask for categorical
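A common workaround, sketched here on made-up data: SMOTE/SMOTENC is a resampler rather than a transformer, so it cannot be a branch of a ColumnTransformer; instead it usually sits as its own step in an imblearn Pipeline, resampling the raw frame before the ColumnTransformer encodes it.

```python
import pandas as pd
from imblearn.over_sampling import SMOTENC
from imblearn.pipeline import Pipeline        # imblearn's Pipeline, not sklearn's
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import OneHotEncoder, StandardScaler

X = pd.DataFrame({
    "income": [30, 40, 55, 70, 22, 90, 31, 65],
    "city":   ["NY", "LA", "NY", "SF", "LA", "SF", "NY", "LA"],
})
y = pd.Series([0, 0, 0, 0, 0, 0, 1, 1])       # imbalanced target

# SMOTENC needs the positions of the categorical columns in the raw frame
categorical_mask = [X.columns.get_loc("city")]

pipe = Pipeline([
    ("smote", SMOTENC(categorical_features=categorical_mask,
                      k_neighbors=1, random_state=0)),
    ("encode", ColumnTransformer([
        ("num", StandardScaler(), ["income"]),
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["city"]),
    ])),
    ("clf", LogisticRegression()),
])

pipe.fit(X, y)
```

imblearn's Pipeline applies the sampler only during fit, so predict behaves like a normal sklearn pipeline.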

Kubeflow Pipelines error on GCP - Run doesn't end

After submitting the "run" using a Jupyter notebook, when I go to the Kubeflow Pipelines dashboard, I can see my "run" submitted and running, but it doesn't end e

Python ZeroMQ parallel pipeline with multiple consumers (workers)

Hello everyone! I would like to combine data from different Python programs via ZeroMQ. I think for that job the best solution would be parallel pipelines as ar
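A minimal sketch of the classic ZeroMQ ventilator/worker/sink layout with pyzmq (ports, endpoints and message contents are made up): one producer PUSHes tasks, any number of worker processes PULL from it and share the stream, and each worker PUSHes its result on to a collecting sink.

```python
import zmq

def producer(n_tasks=100):
    ctx = zmq.Context.instance()
    sender = ctx.socket(zmq.PUSH)
    sender.bind("tcp://*:5557")                # workers connect here
    for i in range(n_tasks):
        sender.send_json({"task_id": i, "payload": i * i})

def worker():
    ctx = zmq.Context.instance()
    receiver = ctx.socket(zmq.PULL)
    receiver.connect("tcp://localhost:5557")   # tasks fan out across workers
    sender = ctx.socket(zmq.PUSH)
    sender.connect("tcp://localhost:5558")     # results fan in to the sink
    while True:
        task = receiver.recv_json()
        result = {"task_id": task["task_id"], "result": task["payload"] + 1}
        sender.send_json(result)

def sink(n_tasks=100):
    ctx = zmq.Context.instance()
    receiver = ctx.socket(zmq.PULL)
    receiver.bind("tcp://*:5558")
    for _ in range(n_tasks):
        print(receiver.recv_json())

if __name__ == "__main__":
    import sys
    role = sys.argv[1] if len(sys.argv) > 1 else "worker"
    {"producer": producer, "worker": worker, "sink": sink}[role]()
```

Run the producer, the sink and several worker processes separately (e.g. `python pipeline_demo.py worker` in multiple terminals, with a hypothetical file name); PUSH sockets load-balance messages across whichever workers are currently connected.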