I am trying to integrate SonarQube into a Bitbucket pipeline, and have the following code there: - pipe: sonarsource/sonarqube-scan:1.0.0 variables:
I want to create an sklearn pipeline that consists of two steps: a custom transformer function and a Keras classification model. This is my data set (of course, I'm provi
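For context, a minimal sketch of what such a two-step pipeline can look like, assuming the scikeras KerasClassifier wrapper; the log1p transform, layer sizes and binary target are placeholders, not the asker's actual setup:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import FunctionTransformer, StandardScaler
from scikeras.wrappers import KerasClassifier
from tensorflow import keras

def build_model(meta):
    # scikeras fills `meta` with details of the transformed data,
    # e.g. the number of input features after the custom step
    model = keras.Sequential([
        keras.layers.Input(shape=(meta["n_features_in_"],)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

pipe = Pipeline([
    ("custom", FunctionTransformer(np.log1p)),   # stand-in for the custom transformer
    ("scale", StandardScaler()),
    ("clf", KerasClassifier(model=build_model, epochs=10, batch_size=32, verbose=0)),
])

# X, y = ...   # feature matrix and labels
# pipe.fit(X, y)
# pipe.predict(X)
```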
I have the following configuration in my GitLab project: cont_eval: variables: DOCKER_TLS_CERTDIR: "/certs" stage: cont_eval
I am confused about the branching instructions BZ and BNZ. Can anybody please explain the concept behind BZ and BNZ and how they work, with an example?
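In short, both instructions test the zero flag set by the previous operation: BZ (branch if zero) jumps when the result was zero, BNZ (branch if non-zero) jumps when it was not. As a rough analogy only (Python standing in for pseudo-assembly), a decrement-and-branch countdown loop behaves like this:

```python
def countdown(n):
    """Rough analogy for a decrement-and-branch-if-not-zero loop."""
    while True:
        n -= 1                      # DEC: sets the zero flag when n reaches 0
        print("loop body, n =", n)
        if n != 0:                  # BNZ loop_top: branch taken while zero flag is clear
            continue
        break                       # zero flag set: BNZ falls through (BZ would branch here)

countdown(3)   # runs the body three times, then exits when the counter hits zero
```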
I have a pipeline with the following tasks: - name: A taskRef: name: buildah-secondary-tag-task runAfter: - maven-prepare-package
I have a dedicated server with an attached Git-Runner. We just want to run something like this on a file path: git pull, npm install, npm run prod. The problem is git-
I want to copy data with scp in a GitLab pipeline using PRIVATE_KEY. The error is: kex_exchange_identification: read: Connection reset by peer Connection reset by x.x
I'm trying to deploy my stack to AWS using cdk deploy my-stack. When doing it in my terminal window it works perfectly, but when I'm doing it in my pipeline I ge
We have a master pipeline, which is responsible for triggering pipelines from multiple projects and performing some steps. I want to pass a file from the first pipe
My goal is to build an MLOps pipeline which is 100% independent of cloud services like AWS, GCP and Azure. I have a project for a client in a production factor
I need help designing a CI/CD pipeline for deploying a Dynamics 365 solution to different environments. Present CI/CD setup: we have 4 BUs,
I have a pipeline with 3 stages: build, deploy-test and deploy-prod. I want the stages to have the following behavior: always run build; run deploy-test automatically
On an Ubuntu 18.04 machine I am trying to use the OpenCV 4.1.2 facedetect element in a GStreamer 1.14.5 pipeline, but unfortunately the plugin is not installed. I downloaded
I am trying to pickle an sklearn machine-learning model and load it in another project. The model is wrapped in a pipeline that does feature encoding, scaling, etc
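A minimal sketch of the usual dump/load flow with joblib; my_features and CustomEncoder are hypothetical names. The important detail is that any custom transformer used in the pipeline must be importable under the same module path in the project that loads the file, otherwise unpickling fails.

```python
# Project A: train and save
import joblib
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
# from my_features import CustomEncoder   # hypothetical custom transformer

pipe = Pipeline([
    # ("encode", CustomEncoder()),         # custom step is pickled by reference to its module
    ("scale", StandardScaler()),
    ("model", LogisticRegression()),
])
# pipe.fit(X, y)
joblib.dump(pipe, "model.joblib")

# Project B: load and predict
# the same my_features module must be importable here,
# otherwise loading fails with "No module named 'my_features'"
pipe = joblib.load("model.joblib")
# pipe.predict(X_new)
```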
When I load another Groovy file in a Jenkinsfile it shows me the following error: "Required context class hudson.FilePath is missing Perhaps you forgot to surround th
I am working on a toy dataset with ColumnTransformer and a pipeline, but I came across an error for which I couldn't find a solution on the internet. toy = pd.read_
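For reference, a minimal sketch of the usual ColumnTransformer-plus-Pipeline layout; the file name, column names and estimator are placeholders rather than the asker's actual code:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.linear_model import LogisticRegression

toy = pd.read_csv("toy.csv")                     # placeholder path
X, y = toy.drop(columns=["target"]), toy["target"]

preprocess = ColumnTransformer([
    ("num", StandardScaler(), ["num_col"]),                        # numeric columns by name
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["cat_col"]),  # categorical columns by name
])

clf = Pipeline([
    ("preprocess", preprocess),
    ("model", LogisticRegression(max_iter=1000)),
])
# clf.fit(X, y)
```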
I am building an Azure Data Factory pipeline and I would like to know how to get this parameter into the Python script. The Python script is l
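One common pattern, sketched below under the assumption that the Data Factory activity passes the pipeline parameter to the script as a command-line argument (the parameter name input_date is made up), is to read it with argparse:

```python
import argparse

# Sketch: assumes the activity invokes the script like
#   python script.py --input_date 2023-01-01
parser = argparse.ArgumentParser()
parser.add_argument("--input_date", required=True)   # hypothetical parameter name
args = parser.parse_args()

print(f"Received pipeline parameter: {args.input_date}")
```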
I'm trying to implement SMOTENC inside a column transformer. However, I'm getting an error. The code and the error are provided below. #Create a mask for categorical
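One detail worth noting: SMOTENC is a resampler, so it normally lives in an imblearn Pipeline rather than inside a ColumnTransformer, which only accepts transformers. A minimal sketch, assuming imbalanced-learn is installed and that cat_mask marks the categorical columns:

```python
import numpy as np
from imblearn.pipeline import Pipeline          # imblearn's Pipeline allows resampling steps
from imblearn.over_sampling import SMOTENC
from sklearn.ensemble import RandomForestClassifier

# Create a mask for categorical columns (placeholder values;
# with a DataFrame this could be X.dtypes == "object")
cat_mask = np.array([True, False, False, True])

pipe = Pipeline([
    ("smote", SMOTENC(categorical_features=cat_mask, random_state=42)),
    ("model", RandomForestClassifier(random_state=42)),
])
# pipe.fit(X, y)   # resampling happens only during fit, not during predict
```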
After submitting the "run" using a Jupyter notebook, when I go to the Kubeflow Pipelines dashboard, I can see my "run" submitted and running, but it doesn't end e
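For context, a minimal run submitted from a notebook with the kfp v1 SDK looks roughly like the sketch below; the image, names and connection details are placeholders:

```python
import kfp
from kfp import dsl

def echo_op():
    # dsl.ContainerOp is the kfp v1 way to define a step from a container image
    return dsl.ContainerOp(
        name="echo",
        image="alpine:3.14",            # placeholder image
        command=["sh", "-c"],
        arguments=['echo "hello from the pipeline"'],
    )

@dsl.pipeline(name="minimal-pipeline", description="Smoke-test pipeline")
def minimal_pipeline():
    echo_op()

# Submit from the notebook; the run then appears on the dashboard.
client = kfp.Client()                   # assumes an in-cluster or preconfigured connection
client.create_run_from_pipeline_func(minimal_pipeline, arguments={})
```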
Hello everyone! I would like to combine data from different Python programs via ZeroMQ. I think the best solution for that job would be parallel pipelines as ar
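The parallel-pipeline pattern in pyzmq is the PUSH/PULL socket pair: one program pushes work items and any number of workers pull them. A minimal sketch, assuming pyzmq and an arbitrary local TCP port:

```python
import zmq

# producer side: pushes messages into the pipeline
def producer():
    ctx = zmq.Context()
    sender = ctx.socket(zmq.PUSH)
    sender.bind("tcp://*:5557")              # arbitrary port
    for i in range(10):
        sender.send_json({"task": i})

# worker side: run any number of these to process items in parallel
def worker():
    ctx = zmq.Context()
    receiver = ctx.socket(zmq.PULL)
    receiver.connect("tcp://localhost:5557")
    while True:
        msg = receiver.recv_json()
        print("processing", msg["task"])
```

ZeroMQ load-balances pushed messages across all connected PULL sockets, which is what makes the pipeline parallel without any extra coordination code.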